Face Generation

In this project, you'll define and train a DCGAN on a dataset of faces. Your goal is to get a generator network to generate new images of faces that look as realistic as possible!

The project will be broken down into a series of tasks from loading in data to defining and training adversarial networks. At the end of the notebook, you'll be able to visualize the results of your trained Generator to see how it performs; your generated samples should look like fairly realistic faces with small amounts of noise.

Get the Data

You'll be using the CelebFaces Attributes Dataset (CelebA) to train your adversarial networks.

This dataset is more complex than the digit datasets (like MNIST or SVHN) you've been working with, so you should expect to define deeper networks and train them for longer to get good results. It is suggested that you utilize a GPU for training.

Pre-processed Data

Since the project's main focus is on building the GANs, we've done some of the pre-processing for you. Each of the CelebA images has been cropped to remove parts of the image that don't include a face, then resized down to 64x64x3 NumPy images. Some sample data is shown below.

If you are working locally, you can download this data by clicking here

This is a zip file that you'll need to extract in the home directory of this notebook for further loading and processing. After extracting the data, you should be left with a data directory, processed_celeba_small/.

In [1]:
# # can comment out after executing
# !unzip processed_celeba_small.zip
In [2]:
data_dir = 'processed_celeba_small/'

"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import pickle as pkl
import matplotlib.pyplot as plt
import numpy as np
import problem_unittests as tests
#import helper

%matplotlib inline

Visualize the CelebA Data

The CelebA dataset contains over 200,000 celebrity images with annotations. Since you're going to be generating faces, you won't need the annotations; you'll only need the images. Note that these are color images with 3 color channels (RGB) each.

Pre-process and Load the Data

Since the project's main focus is on building the GANs, we've done some of the pre-processing for you. Each of the CelebA images has been cropped to remove parts of the image that don't include a face, then resized down to 64x64x3 NumPy images. This pre-processed dataset is a smaller subset of the very large CelebA data.

There are a few other steps that you'll need to transform this data and create a DataLoader.

Exercise: Complete the following get_dataloader function, such that it satisfies these requirements:

  • Your images should be square Tensor images of size image_size x image_size in the x and y dimensions.
  • Your function should return a DataLoader that shuffles and batches these Tensor images.

ImageFolder

To create a dataset given a directory of images, it's recommended that you use PyTorch's ImageFolder wrapper, with a root directory processed_celeba_small/ and a data transformation passed in.

In [3]:
# necessary imports
import os

import torch
from torchvision import datasets
from torchvision import transforms
from torch.utils.data import DataLoader
In [4]:
def get_dataloader(batch_size, image_size, data_dir='processed_celeba_small', image_type='celeba'):
    """
    Batch the neural network data using DataLoader
    :param batch_size: The size of each batch; the number of images in a batch
    :param image_size: The square size of the image data (x, y)
    :param data_dir: Directory where image data is located
    :return: DataLoader with batched data
    """
    
    # resize and normalize the images
    transform = transforms.Compose([transforms.Resize(image_size), # resize to the specified square size
                                   transforms.ToTensor()])
    
    # get the image directory
    data_path = os.path.join(data_dir, image_type)
    
    # define datasets using ImageFolder
    data_dataset = datasets.ImageFolder(data_path, transform)
    
    # create and return DataLoaders
    data_loader = DataLoader(dataset=data_dataset, batch_size=batch_size, shuffle=True)
    
    
    return data_loader

Create a DataLoader

Exercise: Create a DataLoader celeba_train_loader with appropriate hyperparameters.

Call the above function and create a dataloader to view images.

  • You can decide on any reasonable batch_size parameter
  • Your image_size must be 32. Resizing the data to a smaller size will make for faster training, while still creating convincing images of faces!
In [5]:
# Define function hyperparameters
batch_size = 32
img_size = 32

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
# Call your function and get a dataloader
celeba_train_loader = get_dataloader(batch_size, img_size)

Next, you can view some images! You should see square images of somewhat-centered faces.

Note: You'll need to convert the Tensor images into a NumPy type and transpose the dimensions to correctly display an image; suggested imshow code is below, but it may not be perfect.

In [6]:
# helper display function
def imshow(img):
    npimg = img.numpy()
    plt.imshow(np.transpose(npimg, (1, 2, 0)))

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
# obtain one batch of training images
dataiter = iter(celeba_train_loader)
images, _ = next(dataiter) # _ for no labels

# plot the images in the batch
fig = plt.figure(figsize=(20, 4))
plot_size = 20
for idx in np.arange(plot_size):
    ax = fig.add_subplot(2, plot_size // 2, idx + 1, xticks=[], yticks=[])
    imshow(images[idx])

Exercise: Pre-process your image data and scale it to a pixel range of -1 to 1

You need to do a bit of pre-processing; you know that the output of a tanh-activated generator will contain pixel values in a range from -1 to 1, so we need to rescale our training images to the same range. (Right now, they are in a range from 0 to 1.)

In [7]:
# TODO: Complete the scale function
def scale(x, feature_range=(-1, 1)):
    ''' Scale takes in an image x and returns that image, scaled
       with a feature_range of pixel values from -1 to 1. 
       This function assumes that the input x is already scaled from 0-1.'''
    # assume x is scaled to (0, 1)
    # scale to feature_range and return scaled x
    
    # Get the minimum and maximum values of the specified feature_range
    x_min, x_max = feature_range
    
    # Get the range of the specified feature_range
    x_range = x_max - x_min
    
    # Scale a given image x with the specified feature_range
    x = (x * x_range) + x_min
    
    return x
In [8]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
# check scaled range
# should be close to -1 to 1
img = images[0]
scaled_img = scale(img)

print('Min: ', scaled_img.min())
print('Max: ', scaled_img.max())
Min:  tensor(-1.)
Max:  tensor(0.9922)

Define the Model

A GAN is comprised of two adversarial networks, a discriminator and a generator.

Discriminator

Your first task will be to define the discriminator. This is a convolutional classifier like you've built before, only without any maxpooling layers. To deal with this complex data, it's suggested you use a deep network with normalization. You are also allowed to create any helper functions that may be useful.

Exercise: Complete the Discriminator class

  • The inputs to the discriminator are 32x32x3 tensor images
  • The output should be a single value that will indicate whether a given image is real or fake
In [9]:
import torch.nn as nn
import torch.nn.functional as F
In [10]:
# helper conv function
def conv(in_channels, out_channels, kernel_size, stride=2, padding=1, batch_norm=True):
    """Creates a convolutional layer, with optional batch normalization.
    """
    
    layers = []
    conv_layer = nn.Conv2d(in_channels=in_channels, out_channels=out_channels, 
                           kernel_size=kernel_size, stride=stride, padding=padding, bias=False)
    
    layers.append(conv_layer)

    if batch_norm:
        layers.append(nn.BatchNorm2d(out_channels))
    return nn.Sequential(*layers)
In [11]:
class Discriminator(nn.Module):

    def __init__(self, conv_dim):
        """
        Initialize the Discriminator Module
        :param conv_dim: The depth of the first convolutional layer
        """
        super(Discriminator, self).__init__()

        self.conv_dim = conv_dim

        # 32x32 input
        self.conv1 = conv(3, conv_dim, 4, batch_norm=False) # first layer, no batch_norm
        # 16x16 out
        self.conv2 = conv(conv_dim, conv_dim*2, 4)
        # 8x8 out
        self.conv3 = conv(conv_dim*2, conv_dim*4, 4)
        # 4x4 out
        self.conv4 = conv(conv_dim*4, conv_dim*8, 4)
        # 2x2 out
        
        # final, fully-connected layer
        self.fc = nn.Linear(conv_dim*8*2*2, 1)
        

    def forward(self, x):
        """
        Forward propagation of the neural network
        :param x: The input to the neural network     
        :return: Discriminator logits; the output of the neural network
        """
        # define feedforward behavior
        
        out = F.leaky_relu(self.conv1(x), 0.2)
        out = F.leaky_relu(self.conv2(out), 0.2)
        out = F.leaky_relu(self.conv3(out), 0.2)
        out = F.leaky_relu(self.conv4(out), 0.2)
        
        # flatten
        out = out.view(-1, self.conv_dim*8*2*2)
        
        # final output layer
        out = self.fc(out)        
        return out
        

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_discriminator(Discriminator)
Tests Passed

Generator

The generator should upsample an input and generate a new image of the same size as our training data, 32x32x3. This should be mostly transpose convolutional layers with normalization applied to the outputs.

Exercise: Complete the Generator class

  • The inputs to the generator are vectors of some length z_size
  • The output should be an image of shape 32x32x3
In [13]:
# helper deconv function
def deconv(in_channels, out_channels, kernel_size, stride=2, padding=1, batch_norm=True):
    """Creates a transpose convolutional layer, with optional batch normalization.
    """
    layers = []
    # append transpose conv layer
    layers.append(nn.ConvTranspose2d(in_channels, out_channels, kernel_size, stride, padding, bias=False))
    # optional batch norm layer
    if batch_norm:
        layers.append(nn.BatchNorm2d(out_channels))
    return nn.Sequential(*layers)
In [14]:
class Generator(nn.Module):
    
    def __init__(self, z_size, conv_dim):
        """
        Initialize the Generator Module
        :param z_size: The length of the input latent vector, z
        :param conv_dim: The depth of the inputs to the *last* transpose convolutional layer
        """
        super(Generator, self).__init__()
        
        self.conv_dim = conv_dim
        
        # first, fully-connected layer
        self.fc = nn.Linear(z_size, conv_dim*8*2*2)

        # transpose conv layers
        self.t_conv1 = deconv(conv_dim*8, conv_dim*4, 4)
        self.t_conv2 = deconv(conv_dim*4, conv_dim*2, 4)
        self.t_conv3 = deconv(conv_dim*2, conv_dim, 4)
        self.t_conv4 = deconv(conv_dim, 3, 4, batch_norm=False)
        

    def forward(self, x):
        """
        Forward propagation of the neural network
        :param x: The input to the neural network     
        :return: A 32x32x3 Tensor image as output
        """
        # define feedforward behavior
        
        # fully-connected + reshape 
        out = self.fc(x)
        out = out.view(-1, self.conv_dim*8, 2, 2) # (batch_size, depth, 2, 2)
        
        # hidden transpose conv layers + relu
        out = F.relu(self.t_conv1(out))
        out = F.relu(self.t_conv2(out))
        out = F.relu(self.t_conv3(out))
        
        # last layer + tanh activation
        out = self.t_conv4(out)
        out = torch.tanh(out) # F.tanh is deprecated in newer PyTorch
        
        return out

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
tests.test_generator(Generator)
Tests Passed

Initialize the weights of your networks

To help your models converge, you should initialize the weights of the convolutional and linear layers in your model. The original DCGAN paper says:

All weights were initialized from a zero-centered Normal distribution with standard deviation 0.02.

So, your next task will be to define a weight initialization function that does just this!

You can refer back to the lesson on weight initialization or consult existing model code, such as the networks.py file in the CycleGAN GitHub repository, to help you complete this function.

Exercise: Complete the weight initialization function

  • This should initialize only convolutional and linear layers
  • Initialize the weights to a normal distribution, centered around 0, with a standard deviation of 0.02.
  • The bias terms, if they exist, may be left alone or set to 0.
In [15]:
from torch.nn import init
In [17]:
def weights_init_normal(m):
    """
    Applies initial weights to certain layers in a model.
    The weights are taken from a normal distribution 
    with mean = 0, std dev = 0.02.
    :param m: A module or layer in a network    
    """
    # classname will be something like:
    # `Conv`, `BatchNorm2d`, `Linear`, etc.
    classname = m.__class__.__name__
    
    # Apply initial weights to convolutional and linear layers
    # (adapted from https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/models/networks.py)
    if hasattr(m, 'weight') and (classname.find('Conv') != -1 or classname.find('Linear') != -1):
        init_gain = 0.02
        init.normal_(m.weight.data, 0.0, init_gain)
        
        if hasattr(m, 'bias') and m.bias is not None:
            init.constant_(m.bias.data, 0.0)
    
    

Build complete network

Define your models' hyperparameters and instantiate the discriminator and generator from the classes defined above. Make sure you've passed in the correct input arguments.

In [18]:
"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
def build_network(d_conv_dim, g_conv_dim, z_size):
    # define discriminator and generator
    D = Discriminator(d_conv_dim)
    G = Generator(z_size=z_size, conv_dim=g_conv_dim)

    # initialize model weights
    D.apply(weights_init_normal)
    G.apply(weights_init_normal)

    print(D)
    print()
    print(G)
    
    return D, G

Exercise: Define model hyperparameters

In [19]:
# Define model hyperparams
d_conv_dim = 64
g_conv_dim = 64
z_size = 100

"""
DON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE
"""
D, G = build_network(d_conv_dim, g_conv_dim, z_size)
Discriminator(
  (conv1): Sequential(
    (0): Conv2d(3, 64, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
  )
  (conv2): Sequential(
    (0): Conv2d(64, 128, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (conv3): Sequential(
    (0): Conv2d(128, 256, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (conv4): Sequential(
    (0): Conv2d(256, 512, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (fc): Linear(in_features=2048, out_features=1, bias=True)
)

Generator(
  (fc): Linear(in_features=100, out_features=2048, bias=True)
  (t_conv1): Sequential(
    (0): ConvTranspose2d(512, 256, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (t_conv2): Sequential(
    (0): ConvTranspose2d(256, 128, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (t_conv3): Sequential(
    (0): ConvTranspose2d(128, 64, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
    (1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  )
  (t_conv4): Sequential(
    (0): ConvTranspose2d(64, 3, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1), bias=False)
  )
)

Training on GPU

Check if you can train on GPU. Here, we'll set this as a boolean variable train_on_gpu. Later, you'll be responsible for making sure that

  • Models,
  • Model inputs, and
  • Loss function arguments

are moved to GPU, where appropriate.

In [20]:
"""
DON'T MODIFY ANYTHING IN THIS CELL
"""
import torch

# Check for a GPU
train_on_gpu = torch.cuda.is_available()
if not train_on_gpu:
    print('No GPU found. Please use a GPU to train your neural network.')
else:
    print('Training on GPU!')
Training on GPU!

Discriminator and Generator Losses

Now we need to calculate the losses for both types of adversarial networks.

Discriminator Losses

  • For the discriminator, the total loss is the sum of the losses for real and fake images, d_loss = d_real_loss + d_fake_loss.
  • Remember that we want the discriminator to output 1 for real images and 0 for fake images, so we need to set up the losses to reflect that.

Generator Loss

The generator loss will look similar, only with flipped labels. The generator's goal is to get the discriminator to think its generated images are real.

Exercise: Complete real and fake loss functions

You may choose to use either cross entropy or a least squares error loss to complete the following real_loss and fake_loss functions.
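If you go the least squares (LSGAN) route instead of cross entropy, the two losses can be sketched as below. This is a hedged alternative for illustration, not the implementation used later in this notebook (which uses BCEWithLogitsLoss); it assumes D_out holds raw discriminator outputs with no sigmoid applied:

```python
import torch

# Hypothetical least-squares (LSGAN) variants of real_loss/fake_loss.
# They penalize the squared distance of the discriminator output from
# the target label: 1 for real images, 0 for fake images.
def real_loss_ls(D_out):
    # push discriminator outputs for real images toward 1
    return torch.mean((D_out - 1) ** 2)

def fake_loss_ls(D_out):
    # push discriminator outputs for fake images toward 0
    return torch.mean(D_out ** 2)
```

One reason to consider least squares loss is that its gradient does not saturate for confidently wrong predictions, which can make training more stable.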

In [21]:
def real_loss(D_out):
    '''Calculates how close discriminator outputs are to being real.
       param, D_out: discriminator logits
       return: real loss'''
    
    batch_size = D_out.size(0)
    
    # smooth the real labels to help regularize the model:
    # use real labels = 0.9 instead of 1.0
    labels = torch.ones(batch_size)*0.9
    
    # move labels to GPU if available     
    if train_on_gpu:
        labels = labels.cuda()
        
    # binary cross entropy with logits loss
    criterion = nn.BCEWithLogitsLoss()
    # calculate loss
    loss = criterion(D_out.squeeze(), labels)
    
    return loss

def fake_loss(D_out):
    '''Calculates how close discriminator outputs are to being fake.
       param, D_out: discriminator logits
       return: fake loss'''
    
    batch_size = D_out.size(0)
    labels = torch.zeros(batch_size) # fake labels = 0
    if train_on_gpu:
        labels = labels.cuda()
    criterion = nn.BCEWithLogitsLoss()
    # calculate loss
    loss = criterion(D_out.squeeze(), labels)
    
    return loss

Optimizers

Exercise: Define optimizers for your Discriminator (D) and Generator (G)

Define optimizers for your models with appropriate hyperparameters.

In [22]:
import torch.optim as optim

# These hyperparameters are suggested in the DCGAN paper: https://arxiv.org/pdf/1511.06434.pdf

# params
lr = 0.0002
beta1=0.5
beta2=0.999 # default value

# Create optimizers for the discriminator D and generator G
d_optimizer = optim.Adam(D.parameters(), lr, [beta1, beta2])
g_optimizer = optim.Adam(G.parameters(), lr, [beta1, beta2])

Training

Training will involve alternating between training the discriminator and the generator. You'll use your functions real_loss and fake_loss to help you calculate the discriminator losses.

  • You should train the discriminator by alternating on real and fake images
  • Then the generator, which tries to trick the discriminator and should have an opposing loss function

Saving Samples

You've been given some code to print out some loss statistics and save some generated "fake" samples.

Exercise: Complete the training function

Keep in mind that, if you've moved your models to GPU, you'll also have to move any model inputs to GPU.

In [23]:
def train(D, G, n_epochs, print_every=50):
    '''Trains adversarial networks for some number of epochs
       param, D: the discriminator network
       param, G: the generator network
       param, n_epochs: number of epochs to train for
       param, print_every: when to print and record the models' losses
       return: D and G losses'''
    
    # move models to GPU
    if train_on_gpu:
        D.cuda()
        G.cuda()

    # keep track of loss and generated, "fake" samples
    samples = []
    losses = []

    # Get some fixed data for sampling. These are images that are held
    # constant throughout training, and allow us to inspect the model's performance
    sample_size=16
    fixed_z = np.random.uniform(-1, 1, size=(sample_size, z_size))
    fixed_z = torch.from_numpy(fixed_z).float()
    # move z to GPU if available
    if train_on_gpu:
        fixed_z = fixed_z.cuda()

    # epoch training loop
    for epoch in range(n_epochs):

        # batch training loop
        for batch_i, (real_images, _) in enumerate(celeba_train_loader):

            batch_size = real_images.size(0)
            real_images = scale(real_images)

            # ===============================================
            #         YOUR CODE HERE: TRAIN THE NETWORKS
            # ===============================================
            
            # ============================================
            #            TRAIN THE DISCRIMINATOR
            # ============================================
            
            # 1. Train the discriminator on real and fake images
            d_optimizer.zero_grad()
            
            # --- Compute the discriminator losses on real images 
            if train_on_gpu:
                real_images = real_images.cuda()
                
            D_real = D(real_images)
            d_real_loss = real_loss(D_real)
            
            # --- Train with fake images
        
            # Generate fake images
            z = np.random.uniform(-1, 1, size=(batch_size, z_size))
            z = torch.from_numpy(z).float()
            # move z to GPU, if available
            if train_on_gpu:
                z = z.cuda()
            fake_images = G(z)

            # --- Compute the discriminator losses on fake images     
            D_fake = D(fake_images)
            d_fake_loss = fake_loss(D_fake)

            # --- add up loss and perform backprop
            d_loss = d_real_loss + d_fake_loss
            
            d_loss.backward()
            d_optimizer.step()
            
            # =========================================
            #            TRAIN THE GENERATOR
            # =========================================
            
            g_optimizer.zero_grad()
            
            # --- Train with fake images and flipped labels
        
            # Generate fake images
            z = np.random.uniform(-1, 1, size=(batch_size, z_size))
            z = torch.from_numpy(z).float()
            if train_on_gpu:
                z = z.cuda()
            fake_images = G(z)
            
            # --- Compute the discriminator losses on fake images 
            # using flipped labels!
            
            # 2. Train the generator with an adversarial loss
            
            D_fake = D(fake_images)
            g_loss = real_loss(D_fake) # use real loss to flip labels

            # perform backprop
            g_loss.backward()
            g_optimizer.step()
            
            # ===============================================
            #              END OF YOUR CODE
            # ===============================================

            # Print some loss stats
            if batch_i % print_every == 0:
                # append discriminator loss and generator loss
                losses.append((d_loss.item(), g_loss.item()))
                # print discriminator and generator loss
                print('Epoch [{:5d}/{:5d}] | d_loss: {:6.4f} | g_loss: {:6.4f}'.format(
                        epoch+1, n_epochs, d_loss.item(), g_loss.item()))


        ## AFTER EACH EPOCH ##
        # this code assumes your generator is named G, feel free to change the name
        # generate and save sample, fake images
        G.eval() # for generating samples
        samples_z = G(fixed_z)
        samples.append(samples_z)
        G.train() # back to training mode

    # Save training generator samples
    with open('train_samples.pkl', 'wb') as f:
        pkl.dump(samples, f)
    
    # finally return losses
    return losses

Set your number of training epochs and train your GAN!

In [24]:
from workspace_utils import active_session
In [25]:
# set number of epochs 
n_epochs = 50


"""
DON'T MODIFY ANYTHING IN THIS CELL
"""

# training the model
with active_session():
    # call training function
    losses = train(D, G, n_epochs=n_epochs)
Epoch [    1/   50] | d_loss: 1.3581 | g_loss: 1.2980
Epoch [    1/   50] | d_loss: 0.4002 | g_loss: 3.5169
Epoch [    1/   50] | d_loss: 0.4314 | g_loss: 3.4175
Epoch [    1/   50] | d_loss: 0.6915 | g_loss: 3.0819
Epoch [    1/   50] | d_loss: 0.8721 | g_loss: 2.2154
Epoch [    1/   50] | d_loss: 0.7829 | g_loss: 2.9981
Epoch [    1/   50] | d_loss: 0.7697 | g_loss: 2.5357
Epoch [    1/   50] | d_loss: 0.8338 | g_loss: 3.5991
Epoch [    1/   50] | d_loss: 1.0180 | g_loss: 1.7113
Epoch [    1/   50] | d_loss: 0.9523 | g_loss: 2.9295
Epoch [    1/   50] | d_loss: 1.0322 | g_loss: 3.9503
Epoch [    1/   50] | d_loss: 0.8677 | g_loss: 2.3795
Epoch [    1/   50] | d_loss: 0.8144 | g_loss: 3.7865
Epoch [    1/   50] | d_loss: 0.5823 | g_loss: 2.9120
Epoch [    1/   50] | d_loss: 0.7702 | g_loss: 3.0675
Epoch [    1/   50] | d_loss: 0.7159 | g_loss: 3.0474
Epoch [    1/   50] | d_loss: 0.7241 | g_loss: 2.0344
Epoch [    1/   50] | d_loss: 0.7527 | g_loss: 2.3497
Epoch [    1/   50] | d_loss: 0.8318 | g_loss: 2.3491
Epoch [    1/   50] | d_loss: 0.9042 | g_loss: 2.9018
Epoch [    1/   50] | d_loss: 0.9045 | g_loss: 2.2915
Epoch [    1/   50] | d_loss: 0.9573 | g_loss: 2.2728
Epoch [    1/   50] | d_loss: 0.9204 | g_loss: 2.1881
Epoch [    1/   50] | d_loss: 0.8316 | g_loss: 2.5369
Epoch [    1/   50] | d_loss: 0.7548 | g_loss: 2.3403
Epoch [    1/   50] | d_loss: 0.9075 | g_loss: 3.2602
Epoch [    1/   50] | d_loss: 1.2029 | g_loss: 2.7193
Epoch [    1/   50] | d_loss: 0.9719 | g_loss: 1.6801
Epoch [    1/   50] | d_loss: 0.6058 | g_loss: 2.4693
Epoch [    1/   50] | d_loss: 1.1413 | g_loss: 2.8077
Epoch [    1/   50] | d_loss: 0.9134 | g_loss: 1.8467
Epoch [    1/   50] | d_loss: 1.0791 | g_loss: 1.8396
Epoch [    1/   50] | d_loss: 1.1226 | g_loss: 0.9625
Epoch [    1/   50] | d_loss: 1.0916 | g_loss: 1.2722
Epoch [    1/   50] | d_loss: 0.9226 | g_loss: 1.4757
Epoch [    1/   50] | d_loss: 0.8973 | g_loss: 2.5254
Epoch [    2/   50] | d_loss: 0.8139 | g_loss: 1.8797
Epoch [    2/   50] | d_loss: 1.0136 | g_loss: 2.3344
Epoch [    2/   50] | d_loss: 0.6923 | g_loss: 1.9555
Epoch [    2/   50] | d_loss: 0.7750 | g_loss: 2.7211
Epoch [    2/   50] | d_loss: 0.8686 | g_loss: 1.7232
Epoch [    2/   50] | d_loss: 0.9361 | g_loss: 2.9418
Epoch [    2/   50] | d_loss: 0.8611 | g_loss: 2.0122
Epoch [    2/   50] | d_loss: 0.7596 | g_loss: 3.3578
Epoch [    2/   50] | d_loss: 1.0495 | g_loss: 1.9602
Epoch [    2/   50] | d_loss: 0.9311 | g_loss: 2.1721
Epoch [    2/   50] | d_loss: 0.9485 | g_loss: 2.1287
Epoch [    2/   50] | d_loss: 0.8577 | g_loss: 1.9430
Epoch [    2/   50] | d_loss: 1.4954 | g_loss: 0.7540
Epoch [    2/   50] | d_loss: 0.9842 | g_loss: 2.4079
Epoch [    2/   50] | d_loss: 0.8108 | g_loss: 2.4274
Epoch [    2/   50] | d_loss: 1.1949 | g_loss: 2.1008
Epoch [    2/   50] | d_loss: 1.0042 | g_loss: 1.5085
Epoch [    2/   50] | d_loss: 0.9443 | g_loss: 1.6472
Epoch [    2/   50] | d_loss: 1.0419 | g_loss: 1.6434
Epoch [    2/   50] | d_loss: 0.7109 | g_loss: 1.6586
Epoch [    2/   50] | d_loss: 0.7371 | g_loss: 1.2959
Epoch [    2/   50] | d_loss: 0.8381 | g_loss: 2.8046
Epoch [    2/   50] | d_loss: 0.9597 | g_loss: 1.8769
Epoch [    2/   50] | d_loss: 0.7972 | g_loss: 1.7351
Epoch [    2/   50] | d_loss: 1.1927 | g_loss: 1.4220
Epoch [    2/   50] | d_loss: 0.9932 | g_loss: 1.3348
Epoch [    2/   50] | d_loss: 0.8431 | g_loss: 2.5497
Epoch [    2/   50] | d_loss: 0.7337 | g_loss: 1.7698
Epoch [    2/   50] | d_loss: 0.8595 | g_loss: 2.1925
Epoch [    2/   50] | d_loss: 0.7942 | g_loss: 1.6270
Epoch [    2/   50] | d_loss: 0.8008 | g_loss: 1.1115
Epoch [    2/   50] | d_loss: 0.7352 | g_loss: 1.8568
Epoch [    2/   50] | d_loss: 1.0708 | g_loss: 2.4554
Epoch [    2/   50] | d_loss: 0.8617 | g_loss: 2.3744
Epoch [    2/   50] | d_loss: 0.7739 | g_loss: 1.6761
Epoch [    2/   50] | d_loss: 0.6948 | g_loss: 1.4688
Epoch [    3/   50] | d_loss: 0.7075 | g_loss: 2.6227
Epoch [    3/   50] | d_loss: 1.0174 | g_loss: 1.0021
Epoch [    3/   50] | d_loss: 0.8902 | g_loss: 2.1436
Epoch [    3/   50] | d_loss: 0.8927 | g_loss: 1.6651
Epoch [    3/   50] | d_loss: 1.0851 | g_loss: 1.5250
Epoch [    3/   50] | d_loss: 0.8584 | g_loss: 3.1472
Epoch [    3/   50] | d_loss: 0.8246 | g_loss: 2.3337
Epoch [    3/   50] | d_loss: 0.8622 | g_loss: 1.4721
Epoch [    3/   50] | d_loss: 0.8527 | g_loss: 2.2524
Epoch [    3/   50] | d_loss: 1.0259 | g_loss: 3.5732
Epoch [    3/   50] | d_loss: 1.1226 | g_loss: 1.0266
Epoch [    3/   50] | d_loss: 0.9606 | g_loss: 2.6117
Epoch [    3/   50] | d_loss: 0.8795 | g_loss: 1.7401
Epoch [    3/   50] | d_loss: 0.9525 | g_loss: 1.4725
Epoch [    3/   50] | d_loss: 1.0886 | g_loss: 3.0681
Epoch [    3/   50] | d_loss: 0.9749 | g_loss: 0.6884
Epoch [    3/   50] | d_loss: 1.2536 | g_loss: 1.3674
Epoch [    3/   50] | d_loss: 0.5963 | g_loss: 3.0908
Epoch [    3/   50] | d_loss: 0.9153 | g_loss: 2.8213
Epoch [    3/   50] | d_loss: 0.8824 | g_loss: 1.9754
Epoch [    3/   50] | d_loss: 0.9543 | g_loss: 2.1531
Epoch [    3/   50] | d_loss: 1.0494 | g_loss: 1.5247
Epoch [    3/   50] | d_loss: 1.2324 | g_loss: 1.5910
Epoch [    3/   50] | d_loss: 0.8365 | g_loss: 1.5188
Epoch [    3/   50] | d_loss: 0.9347 | g_loss: 2.0056
Epoch [    3/   50] | d_loss: 1.1838 | g_loss: 2.0428
Epoch [    3/   50] | d_loss: 1.2483 | g_loss: 0.8724
Epoch [    3/   50] | d_loss: 1.4463 | g_loss: 1.4535
Epoch [    3/   50] | d_loss: 1.1461 | g_loss: 1.7976
Epoch [    3/   50] | d_loss: 1.0898 | g_loss: 0.4688
Epoch [    3/   50] | d_loss: 1.1225 | g_loss: 1.5840
Epoch [    3/   50] | d_loss: 0.6722 | g_loss: 1.1767
Epoch [    3/   50] | d_loss: 0.9999 | g_loss: 1.9175
Epoch [    3/   50] | d_loss: 0.8615 | g_loss: 2.0534
Epoch [    3/   50] | d_loss: 0.7027 | g_loss: 2.7537
Epoch [    3/   50] | d_loss: 0.7347 | g_loss: 1.2763
Epoch [    4/   50] | d_loss: 0.8020 | g_loss: 2.6772
Epoch [    4/   50] | d_loss: 0.6657 | g_loss: 1.2311
Epoch [    4/   50] | d_loss: 0.8742 | g_loss: 1.4216
Epoch [    4/   50] | d_loss: 1.4249 | g_loss: 4.2036
Epoch [    4/   50] | d_loss: 0.9105 | g_loss: 1.3941
Epoch [    4/   50] | d_loss: 0.7843 | g_loss: 2.1011
Epoch [    4/   50] | d_loss: 0.7283 | g_loss: 1.8012
Epoch [    4/   50] | d_loss: 0.9897 | g_loss: 1.2213
Epoch [    4/   50] | d_loss: 1.0103 | g_loss: 1.9611
Epoch [    4/   50] | d_loss: 1.0668 | g_loss: 2.1265
Epoch [    4/   50] | d_loss: 1.0160 | g_loss: 0.9706
Epoch [    4/   50] | d_loss: 1.0131 | g_loss: 1.8371
Epoch [    4/   50] | d_loss: 1.2370 | g_loss: 2.4193
Epoch [    4/   50] | d_loss: 0.6572 | g_loss: 1.8091
Epoch [    4/   50] | d_loss: 0.8070 | g_loss: 1.9230
Epoch [    4/   50] | d_loss: 1.0887 | g_loss: 2.0875
Epoch [    4/   50] | d_loss: 0.7122 | g_loss: 2.0077
Epoch [    4/   50] | d_loss: 0.9828 | g_loss: 1.2439
Epoch [    4/   50] | d_loss: 1.0392 | g_loss: 1.1016
Epoch [    4/   50] | d_loss: 0.8736 | g_loss: 2.1769
Epoch [    4/   50] | d_loss: 0.9831 | g_loss: 1.5275
Epoch [    4/   50] | d_loss: 0.9105 | g_loss: 1.0906
Epoch [    4/   50] | d_loss: 1.1252 | g_loss: 1.2217
Epoch [    4/   50] | d_loss: 0.9779 | g_loss: 1.7552
Epoch [    4/   50] | d_loss: 0.9783 | g_loss: 1.3484
Epoch [    4/   50] | d_loss: 0.7492 | g_loss: 2.1470
Epoch [    4/   50] | d_loss: 0.7956 | g_loss: 2.1949
Epoch [    4/   50] | d_loss: 0.8426 | g_loss: 2.7018
Epoch [    4/   50] | d_loss: 1.0725 | g_loss: 1.8023
Epoch [    4/   50] | d_loss: 0.8173 | g_loss: 1.4435
Epoch [    4/   50] | d_loss: 1.0030 | g_loss: 1.3202
Epoch [    4/   50] | d_loss: 0.9676 | g_loss: 1.7165
Epoch [    4/   50] | d_loss: 0.9213 | g_loss: 2.3084
Epoch [    4/   50] | d_loss: 0.8016 | g_loss: 1.6903
Epoch [    4/   50] | d_loss: 1.1377 | g_loss: 1.8575
Epoch [    4/   50] | d_loss: 0.9646 | g_loss: 1.4421
Epoch [    5/   50] | d_loss: 1.1062 | g_loss: 1.5091
Epoch [    5/   50] | d_loss: 0.9632 | g_loss: 1.1478
Epoch [    5/   50] | d_loss: 0.6888 | g_loss: 2.4263
Epoch [    5/   50] | d_loss: 0.7627 | g_loss: 1.0121
Epoch [    5/   50] | d_loss: 1.0633 | g_loss: 1.7369
Epoch [    5/   50] | d_loss: 0.6866 | g_loss: 2.4630
Epoch [    5/   50] | d_loss: 1.1910 | g_loss: 1.4443
Epoch [    5/   50] | d_loss: 0.9470 | g_loss: 1.9102
Epoch [    5/   50] | d_loss: 0.9311 | g_loss: 1.9032
Epoch [    5/   50] | d_loss: 0.6259 | g_loss: 2.1349
Epoch [    5/   50] | d_loss: 1.0779 | g_loss: 0.8445
Epoch [    5/   50] | d_loss: 1.0010 | g_loss: 1.2795
Epoch [    5/   50] | d_loss: 1.0077 | g_loss: 1.1870
Epoch [    5/   50] | d_loss: 1.2192 | g_loss: 1.8703
Epoch [    5/   50] | d_loss: 1.0194 | g_loss: 2.1055
Epoch [    5/   50] | d_loss: 0.8158 | g_loss: 1.4600
Epoch [    5/   50] | d_loss: 1.2384 | g_loss: 1.8952
Epoch [    5/   50] | d_loss: 1.4092 | g_loss: 1.4448
Epoch [    5/   50] | d_loss: 1.3519 | g_loss: 2.0111
Epoch [    5/   50] | d_loss: 0.7686 | g_loss: 2.6738
Epoch [    5/   50] | d_loss: 1.0443 | g_loss: 1.0156
Epoch [    5/   50] | d_loss: 0.8588 | g_loss: 1.9115
Epoch [    5/   50] | d_loss: 0.9515 | g_loss: 1.5589
Epoch [    5/   50] | d_loss: 0.9283 | g_loss: 3.1405
Epoch [    5/   50] | d_loss: 0.9651 | g_loss: 1.3271
Epoch [    5/   50] | d_loss: 1.1097 | g_loss: 2.8688
Epoch [    5/   50] | d_loss: 1.0155 | g_loss: 1.1547
Epoch [    5/   50] | d_loss: 0.7403 | g_loss: 1.5195
Epoch [    5/   50] | d_loss: 1.2147 | g_loss: 3.2366
Epoch [    5/   50] | d_loss: 1.0312 | g_loss: 1.7002
Epoch [    5/   50] | d_loss: 0.9220 | g_loss: 1.1727
Epoch [    5/   50] | d_loss: 1.2399 | g_loss: 1.2883
Epoch [    5/   50] | d_loss: 1.0591 | g_loss: 0.7343
Epoch [    5/   50] | d_loss: 1.2726 | g_loss: 2.1417
Epoch [    5/   50] | d_loss: 0.9042 | g_loss: 1.5803
Epoch [    5/   50] | d_loss: 1.3305 | g_loss: 1.3127
Epoch [    6/   50] | d_loss: 1.7168 | g_loss: 3.6851
Epoch [    6/   50] | d_loss: 0.9869 | g_loss: 2.2323
Epoch [    6/   50] | d_loss: 0.7256 | g_loss: 2.4726
Epoch [    6/   50] | d_loss: 1.0393 | g_loss: 2.5778
Epoch [    6/   50] | d_loss: 1.7638 | g_loss: 1.5230
Epoch [    6/   50] | d_loss: 0.8342 | g_loss: 1.3914
Epoch [    6/   50] | d_loss: 0.7373 | g_loss: 1.9250
Epoch [    6/   50] | d_loss: 0.9344 | g_loss: 2.1649
Epoch [    6/   50] | d_loss: 1.3332 | g_loss: 1.0809
Epoch [    6/   50] | d_loss: 1.1250 | g_loss: 2.6522
Epoch [    6/   50] | d_loss: 1.2370 | g_loss: 1.5861
Epoch [    6/   50] | d_loss: 0.7309 | g_loss: 1.6383
Epoch [    6/   50] | d_loss: 1.2274 | g_loss: 1.8296
Epoch [    6/   50] | d_loss: 0.7224 | g_loss: 2.4553
Epoch [    6/   50] | d_loss: 0.6822 | g_loss: 2.1514
Epoch [    6/   50] | d_loss: 0.9714 | g_loss: 1.9961
Epoch [    6/   50] | d_loss: 0.5885 | g_loss: 2.5952
Epoch [    6/   50] | d_loss: 1.0406 | g_loss: 1.3717
Epoch [    6/   50] | d_loss: 1.1831 | g_loss: 2.2194
Epoch [    6/   50] | d_loss: 0.9310 | g_loss: 1.6660
Epoch [    6/   50] | d_loss: 0.8041 | g_loss: 1.9965
Epoch [    6/   50] | d_loss: 0.7356 | g_loss: 1.4012
Epoch [    6/   50] | d_loss: 0.8213 | g_loss: 2.0033
Epoch [    6/   50] | d_loss: 1.4349 | g_loss: 2.4077
Epoch [    6/   50] | d_loss: 1.3157 | g_loss: 1.4396
Epoch [    6/   50] | d_loss: 0.9626 | g_loss: 1.9448
Epoch [    6/   50] | d_loss: 1.2265 | g_loss: 1.8825
Epoch [    6/   50] | d_loss: 0.8935 | g_loss: 1.2920
Epoch [    6/   50] | d_loss: 0.9194 | g_loss: 1.9947
Epoch [    6/   50] | d_loss: 0.9393 | g_loss: 2.2446
Epoch [    6/   50] | d_loss: 1.1641 | g_loss: 1.5105
Epoch [    6/   50] | d_loss: 0.6467 | g_loss: 1.2754
Epoch [    6/   50] | d_loss: 0.9066 | g_loss: 1.0526
Epoch [    6/   50] | d_loss: 0.9854 | g_loss: 1.4633
Epoch [    6/   50] | d_loss: 0.8267 | g_loss: 1.8486
Epoch [    6/   50] | d_loss: 1.0144 | g_loss: 1.4392
Epoch [    7/   50] | d_loss: 1.2550 | g_loss: 1.2638
Epoch [    7/   50] | d_loss: 0.9509 | g_loss: 1.1674
Epoch [    7/   50] | d_loss: 0.6167 | g_loss: 2.8441
Epoch [    7/   50] | d_loss: 0.6839 | g_loss: 2.8066
Epoch [    7/   50] | d_loss: 0.8295 | g_loss: 1.9460
Epoch [    7/   50] | d_loss: 0.7948 | g_loss: 1.8021
Epoch [    7/   50] | d_loss: 0.8457 | g_loss: 1.6137
Epoch [    7/   50] | d_loss: 0.9133 | g_loss: 1.7618
Epoch [    7/   50] | d_loss: 0.6618 | g_loss: 1.7512
Epoch [    7/   50] | d_loss: 0.7116 | g_loss: 3.0238
Epoch [    7/   50] | d_loss: 1.2834 | g_loss: 1.9196
Epoch [    7/   50] | d_loss: 0.9074 | g_loss: 1.5788
Epoch [    7/   50] | d_loss: 0.6570 | g_loss: 2.3027
Epoch [    7/   50] | d_loss: 1.2678 | g_loss: 1.8804
Epoch [    7/   50] | d_loss: 0.7640 | g_loss: 2.1184
Epoch [    7/   50] | d_loss: 0.7253 | g_loss: 1.9821
Epoch [    7/   50] | d_loss: 1.1177 | g_loss: 2.3510
Epoch [    7/   50] | d_loss: 0.6871 | g_loss: 1.3287
Epoch [    7/   50] | d_loss: 0.7373 | g_loss: 2.1803
Epoch [    7/   50] | d_loss: 1.8395 | g_loss: 2.8376
Epoch [    7/   50] | d_loss: 0.9017 | g_loss: 1.9684
Epoch [    7/   50] | d_loss: 1.0195 | g_loss: 2.3256
Epoch [    7/   50] | d_loss: 0.9239 | g_loss: 1.6135
Epoch [    7/   50] | d_loss: 0.8682 | g_loss: 1.7177
Epoch [    7/   50] | d_loss: 1.0555 | g_loss: 2.1786
Epoch [    7/   50] | d_loss: 1.1017 | g_loss: 1.4895
Epoch [    7/   50] | d_loss: 0.8717 | g_loss: 1.1823
Epoch [    7/   50] | d_loss: 0.7644 | g_loss: 1.6542
Epoch [    7/   50] | d_loss: 1.1199 | g_loss: 2.0965
Epoch [    7/   50] | d_loss: 0.8731 | g_loss: 1.0521
Epoch [    7/   50] | d_loss: 0.8895 | g_loss: 1.1439
Epoch [    7/   50] | d_loss: 0.8343 | g_loss: 1.6280
Epoch [    7/   50] | d_loss: 0.6804 | g_loss: 2.2702
Epoch [    7/   50] | d_loss: 0.8659 | g_loss: 0.8896
Epoch [    7/   50] | d_loss: 0.9614 | g_loss: 0.7918
Epoch [    7/   50] | d_loss: 0.6721 | g_loss: 2.3672
Epoch [    8/   50] | d_loss: 0.8629 | g_loss: 2.8979
Epoch [    8/   50] | d_loss: 0.8461 | g_loss: 1.6884
Epoch [    8/   50] | d_loss: 0.9081 | g_loss: 1.1987
Epoch [    8/   50] | d_loss: 0.7918 | g_loss: 3.1316
Epoch [    8/   50] | d_loss: 1.1217 | g_loss: 1.4278
Epoch [    8/   50] | d_loss: 0.5860 | g_loss: 2.4003
Epoch [    8/   50] | d_loss: 0.7786 | g_loss: 1.0391
Epoch [    8/   50] | d_loss: 1.1120 | g_loss: 2.6940
Epoch [    8/   50] | d_loss: 0.7864 | g_loss: 1.4932
Epoch [    8/   50] | d_loss: 1.2448 | g_loss: 1.5179
Epoch [    8/   50] | d_loss: 0.9395 | g_loss: 2.3392
Epoch [    8/   50] | d_loss: 0.7084 | g_loss: 3.1218
Epoch [    8/   50] | d_loss: 1.1154 | g_loss: 2.1145
Epoch [    8/   50] | d_loss: 1.2839 | g_loss: 2.2304
Epoch [    8/   50] | d_loss: 0.6697 | g_loss: 1.4440
Epoch [    8/   50] | d_loss: 0.8169 | g_loss: 1.0483
Epoch [    8/   50] | d_loss: 0.7743 | g_loss: 1.9871
Epoch [    8/   50] | d_loss: 1.0575 | g_loss: 1.3771
Epoch [    8/   50] | d_loss: 0.8298 | g_loss: 2.8049
Epoch [    8/   50] | d_loss: 0.8836 | g_loss: 0.9569
Epoch [    8/   50] | d_loss: 0.7156 | g_loss: 2.0099
Epoch [    8/   50] | d_loss: 0.7614 | g_loss: 1.4054
Epoch [    8/   50] | d_loss: 1.1273 | g_loss: 1.8102
Epoch [    8/   50] | d_loss: 0.9439 | g_loss: 1.4923
Epoch [    8/   50] | d_loss: 0.7819 | g_loss: 1.6514
Epoch [    8/   50] | d_loss: 0.9622 | g_loss: 2.5810
Epoch [    8/   50] | d_loss: 0.5928 | g_loss: 2.3611
Epoch [    8/   50] | d_loss: 0.8104 | g_loss: 1.7558
Epoch [    8/   50] | d_loss: 0.6735 | g_loss: 2.6792
Epoch [    8/   50] | d_loss: 1.1452 | g_loss: 2.0943
Epoch [    8/   50] | d_loss: 1.0859 | g_loss: 1.4184
Epoch [    8/   50] | d_loss: 0.8714 | g_loss: 1.2884
Epoch [    8/   50] | d_loss: 0.9810 | g_loss: 1.2033
Epoch [    8/   50] | d_loss: 0.6312 | g_loss: 1.5634
Epoch [    8/   50] | d_loss: 0.8359 | g_loss: 1.8054
Epoch [    8/   50] | d_loss: 0.8548 | g_loss: 1.1598
Epoch [    9/   50] | d_loss: 0.7480 | g_loss: 1.6389
Epoch [    9/   50] | d_loss: 0.7678 | g_loss: 0.9630
Epoch [    9/   50] | d_loss: 0.6128 | g_loss: 3.2034
Epoch [    9/   50] | d_loss: 0.8353 | g_loss: 1.5400
Epoch [    9/   50] | d_loss: 0.7710 | g_loss: 2.6649
Epoch [    9/   50] | d_loss: 1.0999 | g_loss: 2.2420
Epoch [    9/   50] | d_loss: 0.8087 | g_loss: 2.5794
Epoch [    9/   50] | d_loss: 0.9067 | g_loss: 1.3504
Epoch [    9/   50] | d_loss: 0.8278 | g_loss: 1.3998
Epoch [    9/   50] | d_loss: 1.0086 | g_loss: 1.2195
Epoch [    9/   50] | d_loss: 1.2592 | g_loss: 1.8717
Epoch [    9/   50] | d_loss: 0.9948 | g_loss: 2.8753
Epoch [    9/   50] | d_loss: 0.8953 | g_loss: 1.5261
Epoch [    9/   50] | d_loss: 1.0241 | g_loss: 1.1068
Epoch [    9/   50] | d_loss: 1.0177 | g_loss: 1.6032
Epoch [    9/   50] | d_loss: 0.7847 | g_loss: 1.4005
Epoch [    9/   50] | d_loss: 1.2093 | g_loss: 1.6591
Epoch [    9/   50] | d_loss: 1.0729 | g_loss: 2.0262
Epoch [    9/   50] | d_loss: 1.5230 | g_loss: 1.4351
Epoch [    9/   50] | d_loss: 0.7884 | g_loss: 2.2861
Epoch [    9/   50] | d_loss: 0.7845 | g_loss: 2.1040
Epoch [    9/   50] | d_loss: 0.8203 | g_loss: 1.7218
Epoch [    9/   50] | d_loss: 1.4931 | g_loss: 1.8177
Epoch [    9/   50] | d_loss: 0.8130 | g_loss: 1.6977
Epoch [    9/   50] | d_loss: 0.5968 | g_loss: 1.8800
Epoch [    9/   50] | d_loss: 0.5847 | g_loss: 2.6710
Epoch [    9/   50] | d_loss: 0.5841 | g_loss: 2.0283
Epoch [    9/   50] | d_loss: 0.8379 | g_loss: 1.1119
Epoch [    9/   50] | d_loss: 0.7306 | g_loss: 3.1314
Epoch [    9/   50] | d_loss: 0.6709 | g_loss: 1.8487
Epoch [    9/   50] | d_loss: 0.9154 | g_loss: 1.6753
Epoch [    9/   50] | d_loss: 0.7868 | g_loss: 1.8776
Epoch [    9/   50] | d_loss: 0.7018 | g_loss: 2.1084
Epoch [    9/   50] | d_loss: 0.9893 | g_loss: 1.0400
Epoch [    9/   50] | d_loss: 0.6670 | g_loss: 2.6363
Epoch [    9/   50] | d_loss: 0.7647 | g_loss: 1.6242
Epoch [   10/   50] | d_loss: 0.8276 | g_loss: 2.7354
Epoch [   10/   50] | d_loss: 0.8139 | g_loss: 1.5241
Epoch [   10/   50] | d_loss: 0.5916 | g_loss: 2.5725
Epoch [   10/   50] | d_loss: 0.6072 | g_loss: 3.4753
Epoch [   10/   50] | d_loss: 0.6336 | g_loss: 3.5885
Epoch [   10/   50] | d_loss: 0.6765 | g_loss: 2.4438
Epoch [   10/   50] | d_loss: 0.5633 | g_loss: 2.1677
Epoch [   10/   50] | d_loss: 0.6466 | g_loss: 2.2780
Epoch [   10/   50] | d_loss: 1.0658 | g_loss: 2.8770
Epoch [   10/   50] | d_loss: 1.1452 | g_loss: 1.6924
Epoch [   10/   50] | d_loss: 0.7464 | g_loss: 1.8596
Epoch [   10/   50] | d_loss: 0.8724 | g_loss: 1.1014
Epoch [   10/   50] | d_loss: 0.6010 | g_loss: 2.3752
Epoch [   10/   50] | d_loss: 0.7156 | g_loss: 1.9031
Epoch [   10/   50] | d_loss: 0.8629 | g_loss: 0.8944
Epoch [   10/   50] | d_loss: 0.7044 | g_loss: 2.2818
Epoch [   10/   50] | d_loss: 0.5133 | g_loss: 2.5950
Epoch [   10/   50] | d_loss: 0.7569 | g_loss: 1.4063
Epoch [   10/   50] | d_loss: 0.7977 | g_loss: 2.6296
Epoch [   10/   50] | d_loss: 0.6655 | g_loss: 1.8118
Epoch [   10/   50] | d_loss: 1.0093 | g_loss: 2.5383
Epoch [   10/   50] | d_loss: 0.6945 | g_loss: 2.1765
Epoch [   10/   50] | d_loss: 0.4966 | g_loss: 2.9173
Epoch [   10/   50] | d_loss: 0.6868 | g_loss: 1.4616
Epoch [   10/   50] | d_loss: 0.6072 | g_loss: 1.7770
Epoch [   10/   50] | d_loss: 0.7854 | g_loss: 1.9123
Epoch [   10/   50] | d_loss: 0.7457 | g_loss: 2.8872
Epoch [   10/   50] | d_loss: 1.0843 | g_loss: 2.4202
Epoch [   10/   50] | d_loss: 1.2803 | g_loss: 1.8233
Epoch [   10/   50] | d_loss: 0.9294 | g_loss: 1.7075
Epoch [   10/   50] | d_loss: 0.9756 | g_loss: 3.1306
Epoch [   10/   50] | d_loss: 0.9973 | g_loss: 3.2502
Epoch [   10/   50] | d_loss: 0.6640 | g_loss: 1.7595
Epoch [   10/   50] | d_loss: 0.6345 | g_loss: 3.3318
Epoch [   10/   50] | d_loss: 0.7208 | g_loss: 1.8641
Epoch [   10/   50] | d_loss: 0.6196 | g_loss: 2.3525
Epoch [   11/   50] | d_loss: 1.1552 | g_loss: 3.1500
Epoch [   11/   50] | d_loss: 0.8757 | g_loss: 2.6631
Epoch [   11/   50] | d_loss: 0.7379 | g_loss: 1.3567
Epoch [   11/   50] | d_loss: 0.7519 | g_loss: 2.3009
Epoch [   11/   50] | d_loss: 1.3688 | g_loss: 3.3065
Epoch [   11/   50] | d_loss: 0.5333 | g_loss: 2.7235
Epoch [   11/   50] | d_loss: 0.5500 | g_loss: 2.2152
Epoch [   11/   50] | d_loss: 0.7738 | g_loss: 1.9323
Epoch [   11/   50] | d_loss: 0.4875 | g_loss: 1.9120
Epoch [   11/   50] | d_loss: 0.7536 | g_loss: 2.4136
Epoch [   11/   50] | d_loss: 1.0226 | g_loss: 1.5841
Epoch [   11/   50] | d_loss: 1.0623 | g_loss: 3.2803
Epoch [   11/   50] | d_loss: 0.7834 | g_loss: 2.0405
Epoch [   11/   50] | d_loss: 0.8109 | g_loss: 3.5274
Epoch [   11/   50] | d_loss: 0.6965 | g_loss: 3.0885
Epoch [   11/   50] | d_loss: 0.6562 | g_loss: 2.7072
Epoch [   11/   50] | d_loss: 0.8836 | g_loss: 2.5029
Epoch [   11/   50] | d_loss: 0.7245 | g_loss: 2.0101
Epoch [   11/   50] | d_loss: 0.8579 | g_loss: 1.8718
Epoch [   11/   50] | d_loss: 0.9970 | g_loss: 1.9304
Epoch [   11/   50] | d_loss: 0.8839 | g_loss: 1.1244
Epoch [   11/   50] | d_loss: 0.5830 | g_loss: 1.9919
Epoch [   11/   50] | d_loss: 0.8500 | g_loss: 1.9439
Epoch [   11/   50] | d_loss: 0.6304 | g_loss: 1.1205
Epoch [   11/   50] | d_loss: 1.0816 | g_loss: 2.2743
Epoch [   11/   50] | d_loss: 1.0118 | g_loss: 2.4382
Epoch [   11/   50] | d_loss: 0.6770 | g_loss: 3.0772
Epoch [   11/   50] | d_loss: 0.9006 | g_loss: 1.3030
Epoch [   11/   50] | d_loss: 0.7572 | g_loss: 2.4303
Epoch [   11/   50] | d_loss: 0.8203 | g_loss: 2.4267
Epoch [   11/   50] | d_loss: 0.6084 | g_loss: 3.6798
Epoch [   11/   50] | d_loss: 0.9571 | g_loss: 2.4433
Epoch [   11/   50] | d_loss: 0.6348 | g_loss: 1.2621
Epoch [   11/   50] | d_loss: 0.6326 | g_loss: 2.8395
Epoch [   11/   50] | d_loss: 0.7308 | g_loss: 2.1758
Epoch [   11/   50] | d_loss: 0.7556 | g_loss: 3.2350
Epoch [   12/   50] | d_loss: 0.6459 | g_loss: 3.2162
Epoch [   12/   50] | d_loss: 0.7058 | g_loss: 2.5707
Epoch [   12/   50] | d_loss: 0.6471 | g_loss: 2.4709
Epoch [   12/   50] | d_loss: 1.0366 | g_loss: 1.4184
Epoch [   12/   50] | d_loss: 0.9844 | g_loss: 2.5614
Epoch [   12/   50] | d_loss: 1.0248 | g_loss: 2.3020
Epoch [   12/   50] | d_loss: 0.6447 | g_loss: 1.9922
Epoch [   12/   50] | d_loss: 0.6399 | g_loss: 2.2018
Epoch [   12/   50] | d_loss: 0.5251 | g_loss: 2.5342
Epoch [   12/   50] | d_loss: 0.8918 | g_loss: 1.6461
Epoch [   12/   50] | d_loss: 0.7170 | g_loss: 2.2240
Epoch [   12/   50] | d_loss: 0.7420 | g_loss: 2.7000
Epoch [   12/   50] | d_loss: 0.7755 | g_loss: 1.6634
Epoch [   12/   50] | d_loss: 0.5433 | g_loss: 2.2155
Epoch [   12/   50] | d_loss: 1.0282 | g_loss: 1.2599
Epoch [   12/   50] | d_loss: 0.5909 | g_loss: 2.2889
Epoch [   12/   50] | d_loss: 1.2033 | g_loss: 1.4949
Epoch [   12/   50] | d_loss: 0.7688 | g_loss: 2.6622
Epoch [   12/   50] | d_loss: 0.5268 | g_loss: 2.4977
Epoch [   12/   50] | d_loss: 0.8273 | g_loss: 1.7096
Epoch [   12/   50] | d_loss: 0.5859 | g_loss: 1.8939
Epoch [   12/   50] | d_loss: 0.7767 | g_loss: 3.0687
Epoch [   12/   50] | d_loss: 1.1563 | g_loss: 1.3502
Epoch [   12/   50] | d_loss: 0.9267 | g_loss: 2.0270
Epoch [   12/   50] | d_loss: 0.7715 | g_loss: 2.3408
Epoch [   12/   50] | d_loss: 0.6700 | g_loss: 1.6897
Epoch [   12/   50] | d_loss: 0.8883 | g_loss: 2.1330
Epoch [   12/   50] | d_loss: 0.8169 | g_loss: 2.3161
Epoch [   12/   50] | d_loss: 0.6065 | g_loss: 2.3931
Epoch [   12/   50] | d_loss: 2.4411 | g_loss: 0.9903
Epoch [   12/   50] | d_loss: 0.4693 | g_loss: 2.3395
Epoch [   12/   50] | d_loss: 1.2190 | g_loss: 4.0240
Epoch [   12/   50] | d_loss: 0.7447 | g_loss: 2.1337
Epoch [   12/   50] | d_loss: 0.8610 | g_loss: 1.4905
Epoch [   12/   50] | d_loss: 1.0559 | g_loss: 1.2736
Epoch [   12/   50] | d_loss: 0.6400 | g_loss: 2.3116
Epoch [   13/   50] | d_loss: 0.8579 | g_loss: 1.7067
Epoch [   13/   50] | d_loss: 0.9585 | g_loss: 2.8800
Epoch [   13/   50] | d_loss: 0.5865 | g_loss: 2.5645
Epoch [   13/   50] | d_loss: 0.7170 | g_loss: 1.7831
Epoch [   13/   50] | d_loss: 0.9315 | g_loss: 1.8040
Epoch [   13/   50] | d_loss: 0.5852 | g_loss: 2.8165
Epoch [   13/   50] | d_loss: 0.6616 | g_loss: 1.9863
Epoch [   13/   50] | d_loss: 0.5719 | g_loss: 2.8667
Epoch [   13/   50] | d_loss: 0.6903 | g_loss: 1.6234
Epoch [   13/   50] | d_loss: 1.0401 | g_loss: 1.0788
Epoch [   13/   50] | d_loss: 0.7458 | g_loss: 2.4902
Epoch [   13/   50] | d_loss: 0.5229 | g_loss: 2.4443
Epoch [   13/   50] | d_loss: 0.4451 | g_loss: 3.4698
Epoch [   13/   50] | d_loss: 0.8371 | g_loss: 2.8296
Epoch [   13/   50] | d_loss: 0.6191 | g_loss: 2.7511
Epoch [   13/   50] | d_loss: 0.7716 | g_loss: 2.6375
Epoch [   13/   50] | d_loss: 0.5776 | g_loss: 2.1979
Epoch [   13/   50] | d_loss: 0.7057 | g_loss: 2.6576
Epoch [   13/   50] | d_loss: 0.6244 | g_loss: 1.9959
Epoch [   13/   50] | d_loss: 0.7381 | g_loss: 2.7642
Epoch [   13/   50] | d_loss: 0.6135 | g_loss: 3.5796
Epoch [   13/   50] | d_loss: 0.7912 | g_loss: 2.4934
Epoch [   13/   50] | d_loss: 0.5350 | g_loss: 2.5061
Epoch [   13/   50] | d_loss: 0.6043 | g_loss: 2.3444
Epoch [   13/   50] | d_loss: 1.2060 | g_loss: 1.1197
Epoch [   13/   50] | d_loss: 1.0885 | g_loss: 2.4841
Epoch [   13/   50] | d_loss: 0.7173 | g_loss: 1.4550
Epoch [   13/   50] | d_loss: 0.7429 | g_loss: 2.2454
Epoch [   13/   50] | d_loss: 0.4866 | g_loss: 2.0774
Epoch [   13/   50] | d_loss: 0.5337 | g_loss: 1.9693
Epoch [   13/   50] | d_loss: 1.3286 | g_loss: 1.2555
Epoch [   13/   50] | d_loss: 0.5605 | g_loss: 1.7672
Epoch [   13/   50] | d_loss: 0.7063 | g_loss: 2.5002
Epoch [   13/   50] | d_loss: 0.7788 | g_loss: 1.6158
Epoch [   13/   50] | d_loss: 0.8916 | g_loss: 1.8021
Epoch [   13/   50] | d_loss: 0.5606 | g_loss: 1.6211
Epoch [   14/   50] | d_loss: 0.8462 | g_loss: 3.0112
Epoch [   14/   50] | d_loss: 0.6553 | g_loss: 2.4472
Epoch [   14/   50] | d_loss: 0.5411 | g_loss: 1.6886
Epoch [   14/   50] | d_loss: 0.6447 | g_loss: 2.2636
Epoch [   14/   50] | d_loss: 0.5501 | g_loss: 1.7004
Epoch [   14/   50] | d_loss: 0.6560 | g_loss: 3.2320
Epoch [   14/   50] | d_loss: 0.6881 | g_loss: 2.6262
Epoch [   14/   50] | d_loss: 0.5589 | g_loss: 3.1819
Epoch [   14/   50] | d_loss: 0.5049 | g_loss: 2.3476
Epoch [   14/   50] | d_loss: 1.2343 | g_loss: 1.7562
Epoch [   14/   50] | d_loss: 0.7675 | g_loss: 2.2055
Epoch [   14/   50] | d_loss: 0.8015 | g_loss: 1.9650
Epoch [   14/   50] | d_loss: 0.4432 | g_loss: 3.4143
Epoch [   14/   50] | d_loss: 1.0043 | g_loss: 2.0073
Epoch [   14/   50] | d_loss: 0.6952 | g_loss: 1.8777
Epoch [   14/   50] | d_loss: 0.7674 | g_loss: 2.2016
Epoch [   14/   50] | d_loss: 0.4954 | g_loss: 3.1910
Epoch [   14/   50] | d_loss: 0.4811 | g_loss: 3.7960
Epoch [   14/   50] | d_loss: 0.6231 | g_loss: 2.4022
Epoch [   14/   50] | d_loss: 0.9373 | g_loss: 2.5384
Epoch [   14/   50] | d_loss: 1.2034 | g_loss: 2.7245
Epoch [   14/   50] | d_loss: 1.0287 | g_loss: 3.4461
Epoch [   14/   50] | d_loss: 0.6910 | g_loss: 3.6145
Epoch [   14/   50] | d_loss: 0.6013 | g_loss: 2.4102
Epoch [   14/   50] | d_loss: 0.6077 | g_loss: 1.6465
Epoch [   14/   50] | d_loss: 0.8275 | g_loss: 1.7590
Epoch [   14/   50] | d_loss: 0.7035 | g_loss: 2.7932
Epoch [   14/   50] | d_loss: 1.1178 | g_loss: 0.8348
Epoch [   14/   50] | d_loss: 0.8815 | g_loss: 3.7454
Epoch [   14/   50] | d_loss: 0.7737 | g_loss: 1.8671
Epoch [   14/   50] | d_loss: 0.6996 | g_loss: 1.2417
Epoch [   14/   50] | d_loss: 0.7388 | g_loss: 2.5315
Epoch [   14/   50] | d_loss: 0.7175 | g_loss: 2.6184
Epoch [   14/   50] | d_loss: 0.7335 | g_loss: 1.7832
Epoch [   14/   50] | d_loss: 0.6381 | g_loss: 3.0704
Epoch [   14/   50] | d_loss: 0.9064 | g_loss: 1.3749
Epoch [   15/   50] | d_loss: 0.9457 | g_loss: 2.5528
Epoch [   15/   50] | d_loss: 0.4598 | g_loss: 2.0767
Epoch [   15/   50] | d_loss: 1.0866 | g_loss: 2.9466
Epoch [   15/   50] | d_loss: 0.3972 | g_loss: 3.2743
Epoch [   15/   50] | d_loss: 0.7802 | g_loss: 1.8716
Epoch [   15/   50] | d_loss: 0.6148 | g_loss: 3.2587
Epoch [   15/   50] | d_loss: 0.4944 | g_loss: 2.3009
Epoch [   15/   50] | d_loss: 0.7179 | g_loss: 2.3545
Epoch [   15/   50] | d_loss: 0.7292 | g_loss: 2.0601
Epoch [   15/   50] | d_loss: 0.4990 | g_loss: 2.1443
Epoch [   15/   50] | d_loss: 0.4654 | g_loss: 1.7431
Epoch [   15/   50] | d_loss: 0.5006 | g_loss: 2.2266
Epoch [   15/   50] | d_loss: 0.5172 | g_loss: 1.3190
Epoch [   15/   50] | d_loss: 0.8247 | g_loss: 2.7606
Epoch [   15/   50] | d_loss: 0.7034 | g_loss: 2.4136
Epoch [   15/   50] | d_loss: 0.7185 | g_loss: 2.2965
Epoch [   15/   50] | d_loss: 0.6218 | g_loss: 2.4103
Epoch [   15/   50] | d_loss: 0.8043 | g_loss: 2.7077
Epoch [   15/   50] | d_loss: 0.4571 | g_loss: 3.4796
Epoch [   15/   50] | d_loss: 0.5492 | g_loss: 3.8836
Epoch [   15/   50] | d_loss: 1.9639 | g_loss: 2.1847
Epoch [   15/   50] | d_loss: 0.6781 | g_loss: 3.6334
Epoch [   15/   50] | d_loss: 0.7296 | g_loss: 2.7566
Epoch [   15/   50] | d_loss: 0.7317 | g_loss: 1.8913
Epoch [   15/   50] | d_loss: 0.8320 | g_loss: 2.5128
Epoch [   15/   50] | d_loss: 0.5474 | g_loss: 2.3913
Epoch [   15/   50] | d_loss: 0.4586 | g_loss: 2.3929
Epoch [   15/   50] | d_loss: 0.5547 | g_loss: 2.3364
Epoch [   15/   50] | d_loss: 0.4574 | g_loss: 2.7605
Epoch [   15/   50] | d_loss: 0.5419 | g_loss: 2.9784
Epoch [   15/   50] | d_loss: 0.5489 | g_loss: 3.5757
Epoch [   15/   50] | d_loss: 0.7009 | g_loss: 2.4677
Epoch [   15/   50] | d_loss: 0.7314 | g_loss: 2.2606
Epoch [   15/   50] | d_loss: 0.8880 | g_loss: 3.3699
Epoch [   15/   50] | d_loss: 0.8717 | g_loss: 0.9894
Epoch [   15/   50] | d_loss: 0.6392 | g_loss: 2.1183
Epoch [   16/   50] | d_loss: 0.7120 | g_loss: 2.5708
Epoch [   16/   50] | d_loss: 0.6088 | g_loss: 2.4571
Epoch [   16/   50] | d_loss: 0.4930 | g_loss: 3.6928
Epoch [   16/   50] | d_loss: 0.6691 | g_loss: 2.2829
Epoch [   16/   50] | d_loss: 0.9951 | g_loss: 1.8674
Epoch [   16/   50] | d_loss: 0.5039 | g_loss: 1.9016
Epoch [   16/   50] | d_loss: 0.6086 | g_loss: 2.6939
Epoch [   16/   50] | d_loss: 0.7364 | g_loss: 1.7846
Epoch [   16/   50] | d_loss: 0.5317 | g_loss: 2.2236
Epoch [   16/   50] | d_loss: 0.8184 | g_loss: 3.3773
Epoch [   16/   50] | d_loss: 0.5665 | g_loss: 3.0442
Epoch [   16/   50] | d_loss: 0.6609 | g_loss: 3.1419
Epoch [   16/   50] | d_loss: 0.5102 | g_loss: 3.0505
Epoch [   16/   50] | d_loss: 0.7811 | g_loss: 3.1167
Epoch [   16/   50] | d_loss: 0.4854 | g_loss: 1.7617
Epoch [   16/   50] | d_loss: 0.7077 | g_loss: 2.2924
Epoch [   16/   50] | d_loss: 0.5810 | g_loss: 2.7812
Epoch [   16/   50] | d_loss: 0.5181 | g_loss: 2.3389
Epoch [   16/   50] | d_loss: 0.9077 | g_loss: 4.0121
Epoch [   16/   50] | d_loss: 0.8107 | g_loss: 1.6910
Epoch [   16/   50] | d_loss: 0.6991 | g_loss: 2.3157
Epoch [   16/   50] | d_loss: 0.5321 | g_loss: 2.9367
Epoch [   16/   50] | d_loss: 0.5945 | g_loss: 2.7577
Epoch [   16/   50] | d_loss: 0.4947 | g_loss: 3.4437
Epoch [   16/   50] | d_loss: 0.6198 | g_loss: 2.0321
Epoch [   16/   50] | d_loss: 0.4726 | g_loss: 1.7640
Epoch [   16/   50] | d_loss: 0.4658 | g_loss: 3.0088
Epoch [   16/   50] | d_loss: 0.7109 | g_loss: 3.5479
Epoch [   16/   50] | d_loss: 0.5472 | g_loss: 3.5657
Epoch [   16/   50] | d_loss: 0.7564 | g_loss: 2.7510
Epoch [   16/   50] | d_loss: 0.6364 | g_loss: 2.6743
Epoch [   16/   50] | d_loss: 0.5820 | g_loss: 2.0603
Epoch [   16/   50] | d_loss: 0.8374 | g_loss: 3.8101
Epoch [   16/   50] | d_loss: 0.6233 | g_loss: 2.7501
Epoch [   16/   50] | d_loss: 0.5810 | g_loss: 1.4913
Epoch [   16/   50] | d_loss: 0.4864 | g_loss: 3.5401
Epoch [   17/   50] | d_loss: 0.9847 | g_loss: 1.1765
Epoch [   17/   50] | d_loss: 0.4669 | g_loss: 3.7289
Epoch [   17/   50] | d_loss: 0.7474 | g_loss: 2.1232
Epoch [   17/   50] | d_loss: 0.8262 | g_loss: 2.3247
Epoch [   17/   50] | d_loss: 0.5961 | g_loss: 2.7549
Epoch [   17/   50] | d_loss: 0.6014 | g_loss: 1.7201
Epoch [   17/   50] | d_loss: 0.9459 | g_loss: 2.3244
Epoch [   17/   50] | d_loss: 0.5020 | g_loss: 2.8397
Epoch [   17/   50] | d_loss: 0.5062 | g_loss: 1.5654
Epoch [   17/   50] | d_loss: 0.5673 | g_loss: 2.1670
Epoch [   17/   50] | d_loss: 0.7153 | g_loss: 2.5557
Epoch [   17/   50] | d_loss: 0.4966 | g_loss: 2.1735
Epoch [   17/   50] | d_loss: 0.5215 | g_loss: 2.3360
Epoch [   17/   50] | d_loss: 0.5193 | g_loss: 2.6425
Epoch [   17/   50] | d_loss: 0.4857 | g_loss: 3.1703
Epoch [   17/   50] | d_loss: 0.5402 | g_loss: 3.5422
Epoch [   17/   50] | d_loss: 0.4847 | g_loss: 4.8199
Epoch [   17/   50] | d_loss: 0.6002 | g_loss: 4.5906
Epoch [   17/   50] | d_loss: 0.8792 | g_loss: 1.3501
Epoch [   17/   50] | d_loss: 1.1458 | g_loss: 3.9035
Epoch [   17/   50] | d_loss: 1.0698 | g_loss: 2.7889
Epoch [   17/   50] | d_loss: 0.5620 | g_loss: 3.4533
Epoch [   17/   50] | d_loss: 1.1243 | g_loss: 0.7769
Epoch [   17/   50] | d_loss: 0.6637 | g_loss: 2.0920
Epoch [   17/   50] | d_loss: 0.5271 | g_loss: 3.8110
Epoch [   17/   50] | d_loss: 0.6122 | g_loss: 2.8647
Epoch [   17/   50] | d_loss: 0.4666 | g_loss: 2.8505
Epoch [   17/   50] | d_loss: 0.8784 | g_loss: 4.2843
Epoch [   17/   50] | d_loss: 0.8140 | g_loss: 2.0623
Epoch [   17/   50] | d_loss: 0.4913 | g_loss: 3.2782
Epoch [   17/   50] | d_loss: 0.7729 | g_loss: 1.9727
Epoch [   17/   50] | d_loss: 0.6731 | g_loss: 2.2275
Epoch [   17/   50] | d_loss: 0.6455 | g_loss: 2.4011
Epoch [   17/   50] | d_loss: 0.4281 | g_loss: 2.6330
Epoch [   17/   50] | d_loss: 0.4671 | g_loss: 3.5893
Epoch [   17/   50] | d_loss: 0.7679 | g_loss: 2.9758
Epoch [   18/   50] | d_loss: 0.8094 | g_loss: 2.6888
Epoch [   18/   50] | d_loss: 0.5194 | g_loss: 2.7482
Epoch [   18/   50] | d_loss: 0.4684 | g_loss: 2.4540
Epoch [   18/   50] | d_loss: 0.4415 | g_loss: 3.4516
Epoch [   18/   50] | d_loss: 0.6890 | g_loss: 3.0906
Epoch [   18/   50] | d_loss: 0.5775 | g_loss: 1.7638
Epoch [   18/   50] | d_loss: 0.7048 | g_loss: 1.8951
Epoch [   18/   50] | d_loss: 0.4897 | g_loss: 3.5437
Epoch [   18/   50] | d_loss: 0.7491 | g_loss: 1.5159
Epoch [   18/   50] | d_loss: 0.7748 | g_loss: 1.8302
Epoch [   18/   50] | d_loss: 0.6710 | g_loss: 2.4991
Epoch [   18/   50] | d_loss: 0.5196 | g_loss: 3.6574
Epoch [   18/   50] | d_loss: 0.6350 | g_loss: 2.0136
Epoch [   18/   50] | d_loss: 0.6562 | g_loss: 1.6031
Epoch [   18/   50] | d_loss: 0.5944 | g_loss: 3.6660
Epoch [   18/   50] | d_loss: 0.5066 | g_loss: 3.0995
Epoch [   18/   50] | d_loss: 0.5365 | g_loss: 1.9533
Epoch [   18/   50] | d_loss: 0.5315 | g_loss: 2.3014
Epoch [   18/   50] | d_loss: 0.5277 | g_loss: 2.8220
Epoch [   18/   50] | d_loss: 0.4746 | g_loss: 3.5211
Epoch [   18/   50] | d_loss: 0.6420 | g_loss: 1.8028
Epoch [   18/   50] | d_loss: 0.7656 | g_loss: 2.4284
Epoch [   18/   50] | d_loss: 0.6404 | g_loss: 3.5459
Epoch [   18/   50] | d_loss: 0.4796 | g_loss: 4.0099
Epoch [   18/   50] | d_loss: 0.5803 | g_loss: 3.5553
Epoch [   18/   50] | d_loss: 0.9438 | g_loss: 1.0095
Epoch [   18/   50] | d_loss: 0.5949 | g_loss: 2.8001
Epoch [   18/   50] | d_loss: 0.7464 | g_loss: 3.2592
Epoch [   18/   50] | d_loss: 0.5162 | g_loss: 2.8936
Epoch [   18/   50] | d_loss: 0.5032 | g_loss: 3.0717
Epoch [   18/   50] | d_loss: 0.6297 | g_loss: 2.3546
Epoch [   18/   50] | d_loss: 0.5253 | g_loss: 2.5073
Epoch [   18/   50] | d_loss: 0.6664 | g_loss: 3.1665
Epoch [   18/   50] | d_loss: 1.1953 | g_loss: 0.9458
Epoch [   18/   50] | d_loss: 0.8227 | g_loss: 2.0612
Epoch [   18/   50] | d_loss: 0.5445 | g_loss: 2.3161
Epoch [   19/   50] | d_loss: 0.4287 | g_loss: 3.5169
Epoch [   19/   50] | d_loss: 0.4722 | g_loss: 2.5414
Epoch [   19/   50] | d_loss: 0.4424 | g_loss: 3.1017
Epoch [   19/   50] | d_loss: 0.6417 | g_loss: 2.6300
Epoch [   19/   50] | d_loss: 0.5733 | g_loss: 3.5814
Epoch [   19/   50] | d_loss: 0.6311 | g_loss: 2.3831
Epoch [   19/   50] | d_loss: 0.4734 | g_loss: 3.3211
Epoch [   19/   50] | d_loss: 0.5530 | g_loss: 3.5368
Epoch [   19/   50] | d_loss: 0.5726 | g_loss: 2.7173
Epoch [   19/   50] | d_loss: 0.6869 | g_loss: 2.4157
Epoch [   19/   50] | d_loss: 0.7106 | g_loss: 3.7400
Epoch [   19/   50] | d_loss: 0.5134 | g_loss: 2.8200
Epoch [   19/   50] | d_loss: 0.6389 | g_loss: 3.2156
Epoch [   19/   50] | d_loss: 0.8713 | g_loss: 3.3040
Epoch [   19/   50] | d_loss: 0.5373 | g_loss: 2.8554
Epoch [   19/   50] | d_loss: 0.7149 | g_loss: 2.7954
Epoch [   19/   50] | d_loss: 0.5745 | g_loss: 2.9728
Epoch [   19/   50] | d_loss: 0.4862 | g_loss: 2.5991
Epoch [   19/   50] | d_loss: 0.5671 | g_loss: 3.8860
Epoch [   19/   50] | d_loss: 0.5204 | g_loss: 2.1306
Epoch [   19/   50] | d_loss: 0.5691 | g_loss: 2.0899
Epoch [   19/   50] | d_loss: 0.5497 | g_loss: 2.4633
Epoch [   19/   50] | d_loss: 0.6017 | g_loss: 3.2536
Epoch [   19/   50] | d_loss: 1.3576 | g_loss: 0.6823
Epoch [   19/   50] | d_loss: 0.7301 | g_loss: 1.8659
Epoch [   19/   50] | d_loss: 0.5986 | g_loss: 2.2436
Epoch [   19/   50] | d_loss: 0.5146 | g_loss: 1.9114
Epoch [   19/   50] | d_loss: 0.9020 | g_loss: 4.4074
Epoch [   19/   50] | d_loss: 0.6558 | g_loss: 2.1292
Epoch [   19/   50] | d_loss: 0.7846 | g_loss: 2.6129
Epoch [   19/   50] | d_loss: 0.6059 | g_loss: 2.1000
Epoch [   19/   50] | d_loss: 0.9981 | g_loss: 1.9256
Epoch [   19/   50] | d_loss: 0.4380 | g_loss: 3.5803
Epoch [   19/   50] | d_loss: 0.5162 | g_loss: 3.2560
Epoch [   19/   50] | d_loss: 0.5348 | g_loss: 2.9663
Epoch [   19/   50] | d_loss: 0.6510 | g_loss: 1.1650
Epoch [   20/   50] | d_loss: 0.5333 | g_loss: 1.8970
...
Epoch [   21/   50] | d_loss: 0.6092 | g_loss: 3.6696
...
Epoch [   22/   50] | d_loss: 2.1322 | g_loss: 4.0749
...
Epoch [   23/   50] | d_loss: 0.4984 | g_loss: 2.9901
...
Epoch [   24/   50] | d_loss: 0.9305 | g_loss: 2.6524
...
Epoch [   25/   50] | d_loss: 0.4498 | g_loss: 3.9092
...
Epoch [   26/   50] | d_loss: 0.4594 | g_loss: 2.4808
...
Epoch [   27/   50] | d_loss: 0.4112 | g_loss: 4.1223
...
Epoch [   28/   50] | d_loss: 0.7701 | g_loss: 1.6180
...
Epoch [   29/   50] | d_loss: 0.3803 | g_loss: 2.8994
...
Epoch [   30/   50] | d_loss: 0.8239 | g_loss: 2.5150
...
Epoch [   31/   50] | d_loss: 0.4202 | g_loss: 3.7883
...
Epoch [   32/   50] | d_loss: 0.4306 | g_loss: 3.3814
...
Epoch [   33/   50] | d_loss: 0.7289 | g_loss: 4.2085
...
Epoch [   34/   50] | d_loss: 0.5077 | g_loss: 2.5377
...
Epoch [   35/   50] | d_loss: 0.9074 | g_loss: 3.1773
...
Epoch [   36/   50] | d_loss: 0.4476 | g_loss: 3.5274
...
Epoch [   37/   50] | d_loss: 0.5343 | g_loss: 1.7783
...
Epoch [   38/   50] | d_loss: 0.4666 | g_loss: 3.5884
...
Epoch [   38/   50] | d_loss: 0.5839 | g_loss: 3.3369
Epoch [   38/   50] | d_loss: 0.5168 | g_loss: 5.0006
Epoch [   38/   50] | d_loss: 0.4178 | g_loss: 4.8905
Epoch [   38/   50] | d_loss: 0.3783 | g_loss: 2.5125
Epoch [   38/   50] | d_loss: 0.6061 | g_loss: 2.2288
Epoch [   38/   50] | d_loss: 0.5459 | g_loss: 3.3003
Epoch [   38/   50] | d_loss: 0.4684 | g_loss: 4.6564
Epoch [   38/   50] | d_loss: 0.6831 | g_loss: 3.0868
Epoch [   38/   50] | d_loss: 0.4175 | g_loss: 3.0779
Epoch [   38/   50] | d_loss: 0.8155 | g_loss: 2.5005
Epoch [   38/   50] | d_loss: 0.4363 | g_loss: 4.4276
Epoch [   38/   50] | d_loss: 0.3655 | g_loss: 3.0145
Epoch [   38/   50] | d_loss: 0.4308 | g_loss: 2.2968
Epoch [   38/   50] | d_loss: 0.4772 | g_loss: 3.4476
Epoch [   38/   50] | d_loss: 0.5596 | g_loss: 2.7286
Epoch [   39/   50] | d_loss: 0.7143 | g_loss: 3.3371
Epoch [   39/   50] | d_loss: 0.4415 | g_loss: 2.5290
Epoch [   39/   50] | d_loss: 0.4187 | g_loss: 4.4492
Epoch [   39/   50] | d_loss: 0.6247 | g_loss: 1.6228
Epoch [   39/   50] | d_loss: 0.4604 | g_loss: 4.4303
Epoch [   39/   50] | d_loss: 0.5274 | g_loss: 3.9786
Epoch [   39/   50] | d_loss: 0.4330 | g_loss: 4.7940
Epoch [   39/   50] | d_loss: 0.5343 | g_loss: 2.8691
Epoch [   39/   50] | d_loss: 0.3942 | g_loss: 4.0530
Epoch [   39/   50] | d_loss: 0.5285 | g_loss: 5.0677
Epoch [   39/   50] | d_loss: 0.9271 | g_loss: 3.4667
Epoch [   39/   50] | d_loss: 0.5321 | g_loss: 3.1078
Epoch [   39/   50] | d_loss: 0.5679 | g_loss: 3.7607
Epoch [   39/   50] | d_loss: 0.4572 | g_loss: 2.4709
Epoch [   39/   50] | d_loss: 0.4772 | g_loss: 3.5764
Epoch [   39/   50] | d_loss: 0.4827 | g_loss: 3.6247
Epoch [   39/   50] | d_loss: 0.3965 | g_loss: 4.7497
Epoch [   39/   50] | d_loss: 0.4448 | g_loss: 3.8639
Epoch [   39/   50] | d_loss: 0.4074 | g_loss: 5.2024
Epoch [   39/   50] | d_loss: 0.4856 | g_loss: 3.3454
Epoch [   39/   50] | d_loss: 0.4071 | g_loss: 3.1645
Epoch [   39/   50] | d_loss: 0.4601 | g_loss: 2.8876
Epoch [   39/   50] | d_loss: 0.3987 | g_loss: 3.1826
Epoch [   39/   50] | d_loss: 0.5147 | g_loss: 2.8340
Epoch [   39/   50] | d_loss: 0.5293 | g_loss: 3.1347
Epoch [   39/   50] | d_loss: 0.4964 | g_loss: 3.8519
Epoch [   39/   50] | d_loss: 0.4584 | g_loss: 4.3271
Epoch [   39/   50] | d_loss: 0.5394 | g_loss: 3.3932
Epoch [   39/   50] | d_loss: 0.4951 | g_loss: 4.5803
Epoch [   39/   50] | d_loss: 0.4687 | g_loss: 3.4422
Epoch [   39/   50] | d_loss: 0.3988 | g_loss: 3.4704
Epoch [   39/   50] | d_loss: 0.4398 | g_loss: 3.2592
Epoch [   39/   50] | d_loss: 0.5896 | g_loss: 2.1876
Epoch [   39/   50] | d_loss: 0.5162 | g_loss: 3.2714
Epoch [   39/   50] | d_loss: 0.4649 | g_loss: 3.7320
Epoch [   39/   50] | d_loss: 0.5227 | g_loss: 2.7113
Epoch [   40/   50] | d_loss: 0.5833 | g_loss: 3.2639
Epoch [   40/   50] | d_loss: 0.4201 | g_loss: 3.3844
Epoch [   40/   50] | d_loss: 0.4287 | g_loss: 3.7827
Epoch [   40/   50] | d_loss: 0.4495 | g_loss: 3.0355
Epoch [   40/   50] | d_loss: 0.5620 | g_loss: 3.2838
Epoch [   40/   50] | d_loss: 0.5231 | g_loss: 3.0783
Epoch [   40/   50] | d_loss: 0.6349 | g_loss: 3.1352
Epoch [   40/   50] | d_loss: 0.4816 | g_loss: 3.6237
Epoch [   40/   50] | d_loss: 1.1922 | g_loss: 2.0151
Epoch [   40/   50] | d_loss: 0.7054 | g_loss: 2.1906
Epoch [   40/   50] | d_loss: 0.4278 | g_loss: 4.0607
Epoch [   40/   50] | d_loss: 0.4822 | g_loss: 3.4119
Epoch [   40/   50] | d_loss: 0.5588 | g_loss: 2.2333
Epoch [   40/   50] | d_loss: 0.4905 | g_loss: 2.2933
Epoch [   40/   50] | d_loss: 0.4745 | g_loss: 2.3445
Epoch [   40/   50] | d_loss: 0.5044 | g_loss: 4.3530
Epoch [   40/   50] | d_loss: 0.4320 | g_loss: 3.4731
Epoch [   40/   50] | d_loss: 0.4241 | g_loss: 2.9452
Epoch [   40/   50] | d_loss: 0.5122 | g_loss: 3.2956
Epoch [   40/   50] | d_loss: 0.3873 | g_loss: 4.6039
Epoch [   40/   50] | d_loss: 0.5999 | g_loss: 3.5742
Epoch [   40/   50] | d_loss: 0.4832 | g_loss: 3.6142
Epoch [   40/   50] | d_loss: 0.4297 | g_loss: 3.1835
Epoch [   40/   50] | d_loss: 0.6338 | g_loss: 3.0653
Epoch [   40/   50] | d_loss: 0.5899 | g_loss: 2.9258
Epoch [   40/   50] | d_loss: 0.6376 | g_loss: 2.5262
Epoch [   40/   50] | d_loss: 0.3995 | g_loss: 4.6658
Epoch [   40/   50] | d_loss: 0.4880 | g_loss: 4.5186
Epoch [   40/   50] | d_loss: 0.4615 | g_loss: 3.2355
Epoch [   40/   50] | d_loss: 0.5811 | g_loss: 3.2160
Epoch [   40/   50] | d_loss: 0.5276 | g_loss: 3.0916
Epoch [   40/   50] | d_loss: 0.4989 | g_loss: 3.0659
Epoch [   40/   50] | d_loss: 0.7081 | g_loss: 4.1680
Epoch [   40/   50] | d_loss: 0.4688 | g_loss: 4.9261
Epoch [   40/   50] | d_loss: 0.8748 | g_loss: 0.8788
Epoch [   40/   50] | d_loss: 0.6082 | g_loss: 3.1166
Epoch [   41/   50] | d_loss: 0.5173 | g_loss: 2.8883
Epoch [   41/   50] | d_loss: 0.6190 | g_loss: 2.6024
Epoch [   41/   50] | d_loss: 0.4135 | g_loss: 3.0765
Epoch [   41/   50] | d_loss: 0.5896 | g_loss: 3.5769
Epoch [   41/   50] | d_loss: 0.5151 | g_loss: 2.3808
Epoch [   41/   50] | d_loss: 0.6669 | g_loss: 3.4008
Epoch [   41/   50] | d_loss: 0.4437 | g_loss: 3.4027
Epoch [   41/   50] | d_loss: 0.4675 | g_loss: 3.4803
Epoch [   41/   50] | d_loss: 0.4949 | g_loss: 3.5045
Epoch [   41/   50] | d_loss: 0.6967 | g_loss: 2.4879
Epoch [   41/   50] | d_loss: 0.3945 | g_loss: 3.6857
Epoch [   41/   50] | d_loss: 0.4431 | g_loss: 2.3537
Epoch [   41/   50] | d_loss: 0.4061 | g_loss: 4.0515
Epoch [   41/   50] | d_loss: 0.5317 | g_loss: 4.0707
Epoch [   41/   50] | d_loss: 0.4815 | g_loss: 3.2337
Epoch [   41/   50] | d_loss: 0.5706 | g_loss: 1.5683
Epoch [   41/   50] | d_loss: 0.6058 | g_loss: 2.9359
Epoch [   41/   50] | d_loss: 0.7301 | g_loss: 3.5690
Epoch [   41/   50] | d_loss: 0.4987 | g_loss: 2.4595
Epoch [   41/   50] | d_loss: 0.4480 | g_loss: 2.9303
Epoch [   41/   50] | d_loss: 0.4861 | g_loss: 3.3573
Epoch [   41/   50] | d_loss: 0.4434 | g_loss: 2.9586
Epoch [   41/   50] | d_loss: 0.6558 | g_loss: 2.7510
Epoch [   41/   50] | d_loss: 0.4490 | g_loss: 3.7207
Epoch [   41/   50] | d_loss: 0.3923 | g_loss: 4.2004
Epoch [   41/   50] | d_loss: 0.5274 | g_loss: 4.4808
Epoch [   41/   50] | d_loss: 0.4528 | g_loss: 3.8150
Epoch [   41/   50] | d_loss: 0.4942 | g_loss: 2.2054
Epoch [   41/   50] | d_loss: 0.5229 | g_loss: 2.6328
Epoch [   41/   50] | d_loss: 0.6511 | g_loss: 3.3975
Epoch [   41/   50] | d_loss: 0.4377 | g_loss: 3.0183
Epoch [   41/   50] | d_loss: 0.4720 | g_loss: 3.7669
Epoch [   41/   50] | d_loss: 0.5083 | g_loss: 3.1711
Epoch [   41/   50] | d_loss: 0.4161 | g_loss: 4.2403
Epoch [   41/   50] | d_loss: 0.4474 | g_loss: 1.7841
Epoch [   41/   50] | d_loss: 0.3975 | g_loss: 4.6247
Epoch [   42/   50] | d_loss: 0.4366 | g_loss: 2.7382
Epoch [   42/   50] | d_loss: 0.3781 | g_loss: 3.8128
Epoch [   42/   50] | d_loss: 0.4213 | g_loss: 4.4660
Epoch [   42/   50] | d_loss: 0.4619 | g_loss: 3.0202
Epoch [   42/   50] | d_loss: 0.4739 | g_loss: 4.9663
Epoch [   42/   50] | d_loss: 0.5032 | g_loss: 3.1816
Epoch [   42/   50] | d_loss: 0.5138 | g_loss: 3.8686
Epoch [   42/   50] | d_loss: 0.5246 | g_loss: 2.8085
Epoch [   42/   50] | d_loss: 0.4010 | g_loss: 3.5277
Epoch [   42/   50] | d_loss: 0.7756 | g_loss: 3.2417
Epoch [   42/   50] | d_loss: 0.3728 | g_loss: 3.4773
Epoch [   42/   50] | d_loss: 0.3844 | g_loss: 3.5954
Epoch [   42/   50] | d_loss: 0.6071 | g_loss: 2.5744
Epoch [   42/   50] | d_loss: 0.3956 | g_loss: 2.8667
Epoch [   42/   50] | d_loss: 0.4524 | g_loss: 4.3380
Epoch [   42/   50] | d_loss: 0.4990 | g_loss: 3.2851
Epoch [   42/   50] | d_loss: 0.3937 | g_loss: 4.2441
Epoch [   42/   50] | d_loss: 0.4431 | g_loss: 3.6272
Epoch [   42/   50] | d_loss: 0.6876 | g_loss: 4.3288
Epoch [   42/   50] | d_loss: 0.4287 | g_loss: 3.1300
Epoch [   42/   50] | d_loss: 0.7512 | g_loss: 3.5558
Epoch [   42/   50] | d_loss: 1.0123 | g_loss: 4.7940
Epoch [   42/   50] | d_loss: 0.4494 | g_loss: 3.4772
Epoch [   42/   50] | d_loss: 0.6609 | g_loss: 2.8283
Epoch [   42/   50] | d_loss: 0.5635 | g_loss: 2.4504
Epoch [   42/   50] | d_loss: 0.4925 | g_loss: 3.5756
Epoch [   42/   50] | d_loss: 0.6616 | g_loss: 4.6532
Epoch [   42/   50] | d_loss: 0.7057 | g_loss: 3.5495
Epoch [   42/   50] | d_loss: 0.4642 | g_loss: 2.4435
Epoch [   42/   50] | d_loss: 0.5059 | g_loss: 4.5791
Epoch [   42/   50] | d_loss: 0.3910 | g_loss: 4.0461
Epoch [   42/   50] | d_loss: 0.5958 | g_loss: 2.4162
Epoch [   42/   50] | d_loss: 0.5182 | g_loss: 2.3532
Epoch [   42/   50] | d_loss: 0.6081 | g_loss: 3.3314
Epoch [   42/   50] | d_loss: 0.4505 | g_loss: 2.9921
Epoch [   42/   50] | d_loss: 0.4716 | g_loss: 2.7017
Epoch [   43/   50] | d_loss: 0.6008 | g_loss: 3.0935
Epoch [   43/   50] | d_loss: 0.5904 | g_loss: 3.3170
Epoch [   43/   50] | d_loss: 0.6829 | g_loss: 4.2095
Epoch [   43/   50] | d_loss: 0.5345 | g_loss: 2.5481
Epoch [   43/   50] | d_loss: 0.4803 | g_loss: 3.5060
Epoch [   43/   50] | d_loss: 0.4761 | g_loss: 3.0492
Epoch [   43/   50] | d_loss: 0.6124 | g_loss: 2.3076
Epoch [   43/   50] | d_loss: 0.4886 | g_loss: 1.8953
Epoch [   43/   50] | d_loss: 0.3828 | g_loss: 3.8256
Epoch [   43/   50] | d_loss: 0.4704 | g_loss: 3.7670
Epoch [   43/   50] | d_loss: 0.4094 | g_loss: 3.3193
Epoch [   43/   50] | d_loss: 0.4063 | g_loss: 3.4875
Epoch [   43/   50] | d_loss: 0.3979 | g_loss: 2.6376
Epoch [   43/   50] | d_loss: 0.5902 | g_loss: 3.1461
Epoch [   43/   50] | d_loss: 0.4942 | g_loss: 4.0710
Epoch [   43/   50] | d_loss: 0.5440 | g_loss: 1.9273
Epoch [   43/   50] | d_loss: 0.4350 | g_loss: 4.0787
Epoch [   43/   50] | d_loss: 0.4015 | g_loss: 5.1019
Epoch [   43/   50] | d_loss: 0.4957 | g_loss: 2.7004
Epoch [   43/   50] | d_loss: 0.4105 | g_loss: 4.2131
Epoch [   43/   50] | d_loss: 0.4839 | g_loss: 3.8923
Epoch [   43/   50] | d_loss: 0.5120 | g_loss: 3.3511
Epoch [   43/   50] | d_loss: 0.4826 | g_loss: 3.1829
Epoch [   43/   50] | d_loss: 0.4401 | g_loss: 3.8347
Epoch [   43/   50] | d_loss: 0.5211 | g_loss: 4.7038
Epoch [   43/   50] | d_loss: 0.4156 | g_loss: 3.9709
Epoch [   43/   50] | d_loss: 0.5484 | g_loss: 2.7034
Epoch [   43/   50] | d_loss: 0.3862 | g_loss: 3.4176
Epoch [   43/   50] | d_loss: 0.3851 | g_loss: 2.0387
Epoch [   43/   50] | d_loss: 0.4096 | g_loss: 5.3175
Epoch [   43/   50] | d_loss: 0.4567 | g_loss: 2.8959
Epoch [   43/   50] | d_loss: 0.5305 | g_loss: 2.7425
Epoch [   43/   50] | d_loss: 0.4918 | g_loss: 3.0464
Epoch [   43/   50] | d_loss: 0.4436 | g_loss: 5.0126
Epoch [   43/   50] | d_loss: 0.3666 | g_loss: 3.7215
Epoch [   43/   50] | d_loss: 0.4586 | g_loss: 2.2873
Epoch [   44/   50] | d_loss: 0.4502 | g_loss: 4.8448
Epoch [   44/   50] | d_loss: 0.6019 | g_loss: 3.7511
Epoch [   44/   50] | d_loss: 0.4412 | g_loss: 4.1798
Epoch [   44/   50] | d_loss: 0.5213 | g_loss: 2.4278
Epoch [   44/   50] | d_loss: 0.5432 | g_loss: 2.8717
Epoch [   44/   50] | d_loss: 0.5915 | g_loss: 2.1925
Epoch [   44/   50] | d_loss: 0.3952 | g_loss: 3.5596
Epoch [   44/   50] | d_loss: 0.5137 | g_loss: 2.9752
Epoch [   44/   50] | d_loss: 0.3801 | g_loss: 3.2617
Epoch [   44/   50] | d_loss: 0.3934 | g_loss: 4.3189
Epoch [   44/   50] | d_loss: 0.6998 | g_loss: 2.0803
Epoch [   44/   50] | d_loss: 0.4438 | g_loss: 3.4269
Epoch [   44/   50] | d_loss: 0.5865 | g_loss: 2.7019
Epoch [   44/   50] | d_loss: 0.6949 | g_loss: 3.4754
Epoch [   44/   50] | d_loss: 0.4383 | g_loss: 4.0699
Epoch [   44/   50] | d_loss: 0.5882 | g_loss: 1.9643
Epoch [   44/   50] | d_loss: 0.5415 | g_loss: 4.1782
Epoch [   44/   50] | d_loss: 0.3849 | g_loss: 4.8639
Epoch [   44/   50] | d_loss: 0.4127 | g_loss: 4.0149
Epoch [   44/   50] | d_loss: 0.4688 | g_loss: 3.6471
Epoch [   44/   50] | d_loss: 0.4626 | g_loss: 3.9039
Epoch [   44/   50] | d_loss: 0.5575 | g_loss: 3.3541
Epoch [   44/   50] | d_loss: 0.4891 | g_loss: 2.6650
Epoch [   44/   50] | d_loss: 0.4884 | g_loss: 3.0841
Epoch [   44/   50] | d_loss: 0.6255 | g_loss: 4.2371
Epoch [   44/   50] | d_loss: 0.8272 | g_loss: 2.3477
Epoch [   44/   50] | d_loss: 0.4972 | g_loss: 2.2914
Epoch [   44/   50] | d_loss: 0.3844 | g_loss: 3.6712
Epoch [   44/   50] | d_loss: 0.6161 | g_loss: 2.7011
Epoch [   44/   50] | d_loss: 0.5046 | g_loss: 3.9540
Epoch [   44/   50] | d_loss: 0.6889 | g_loss: 1.8667
Epoch [   44/   50] | d_loss: 0.4774 | g_loss: 3.4369
Epoch [   44/   50] | d_loss: 0.5185 | g_loss: 4.1110
Epoch [   44/   50] | d_loss: 0.4536 | g_loss: 3.5288
Epoch [   44/   50] | d_loss: 0.4813 | g_loss: 2.3111
Epoch [   44/   50] | d_loss: 0.4860 | g_loss: 1.5475
Epoch [   45/   50] | d_loss: 1.0282 | g_loss: 3.9991
Epoch [   45/   50] | d_loss: 0.6143 | g_loss: 4.2192
Epoch [   45/   50] | d_loss: 0.4671 | g_loss: 2.5211
Epoch [   45/   50] | d_loss: 0.4324 | g_loss: 2.8649
Epoch [   45/   50] | d_loss: 0.4771 | g_loss: 2.0196
Epoch [   45/   50] | d_loss: 0.4460 | g_loss: 3.4052
Epoch [   45/   50] | d_loss: 0.5425 | g_loss: 4.8166
Epoch [   45/   50] | d_loss: 0.4131 | g_loss: 3.8352
Epoch [   45/   50] | d_loss: 0.4072 | g_loss: 3.4158
Epoch [   45/   50] | d_loss: 0.3949 | g_loss: 4.0922
Epoch [   45/   50] | d_loss: 0.6932 | g_loss: 4.2888
Epoch [   45/   50] | d_loss: 0.5517 | g_loss: 4.6707
Epoch [   45/   50] | d_loss: 0.7168 | g_loss: 4.0955
Epoch [   45/   50] | d_loss: 0.6058 | g_loss: 2.9931
Epoch [   45/   50] | d_loss: 0.4352 | g_loss: 3.0485
Epoch [   45/   50] | d_loss: 0.4832 | g_loss: 3.5664
Epoch [   45/   50] | d_loss: 0.5522 | g_loss: 1.9102
Epoch [   45/   50] | d_loss: 0.4548 | g_loss: 3.1949
Epoch [   45/   50] | d_loss: 0.5025 | g_loss: 4.5205
Epoch [   45/   50] | d_loss: 0.4076 | g_loss: 3.5346
Epoch [   45/   50] | d_loss: 0.4631 | g_loss: 4.4201
Epoch [   45/   50] | d_loss: 0.6839 | g_loss: 3.3925
Epoch [   45/   50] | d_loss: 0.4505 | g_loss: 2.3558
Epoch [   45/   50] | d_loss: 0.7647 | g_loss: 1.9907
Epoch [   45/   50] | d_loss: 0.4698 | g_loss: 2.7023
Epoch [   45/   50] | d_loss: 0.4837 | g_loss: 2.7316
Epoch [   45/   50] | d_loss: 0.6289 | g_loss: 3.1703
Epoch [   45/   50] | d_loss: 0.4762 | g_loss: 3.4267
Epoch [   45/   50] | d_loss: 0.5554 | g_loss: 5.2482
Epoch [   45/   50] | d_loss: 0.5651 | g_loss: 3.0730
Epoch [   45/   50] | d_loss: 0.5893 | g_loss: 4.4145
Epoch [   45/   50] | d_loss: 0.5163 | g_loss: 2.8962
Epoch [   45/   50] | d_loss: 0.4343 | g_loss: 4.8339
Epoch [   45/   50] | d_loss: 0.3856 | g_loss: 5.4979
Epoch [   45/   50] | d_loss: 0.6430 | g_loss: 4.1325
Epoch [   45/   50] | d_loss: 0.4401 | g_loss: 4.2756
Epoch [   46/   50] | d_loss: 0.4502 | g_loss: 2.6352
Epoch [   46/   50] | d_loss: 0.3679 | g_loss: 4.5901
Epoch [   46/   50] | d_loss: 0.4046 | g_loss: 3.1784
Epoch [   46/   50] | d_loss: 0.4073 | g_loss: 4.8393
Epoch [   46/   50] | d_loss: 0.4660 | g_loss: 3.9896
Epoch [   46/   50] | d_loss: 0.5021 | g_loss: 2.6290
Epoch [   46/   50] | d_loss: 0.4259 | g_loss: 4.3053
Epoch [   46/   50] | d_loss: 0.4343 | g_loss: 2.9197
Epoch [   46/   50] | d_loss: 0.4953 | g_loss: 2.6197
Epoch [   46/   50] | d_loss: 0.3965 | g_loss: 4.6349
Epoch [   46/   50] | d_loss: 0.4275 | g_loss: 3.8697
Epoch [   46/   50] | d_loss: 0.5682 | g_loss: 2.1374
Epoch [   46/   50] | d_loss: 0.3927 | g_loss: 5.1037
Epoch [   46/   50] | d_loss: 0.6974 | g_loss: 1.9144
Epoch [   46/   50] | d_loss: 0.4277 | g_loss: 3.8682
Epoch [   46/   50] | d_loss: 0.5716 | g_loss: 2.4378
Epoch [   46/   50] | d_loss: 0.4786 | g_loss: 2.9868
Epoch [   46/   50] | d_loss: 0.5738 | g_loss: 2.9970
Epoch [   46/   50] | d_loss: 0.4212 | g_loss: 2.7943
Epoch [   46/   50] | d_loss: 0.5512 | g_loss: 2.6334
Epoch [   46/   50] | d_loss: 0.4667 | g_loss: 4.3618
Epoch [   46/   50] | d_loss: 0.4319 | g_loss: 3.0614
Epoch [   46/   50] | d_loss: 1.0483 | g_loss: 2.1685
Epoch [   46/   50] | d_loss: 0.5441 | g_loss: 2.5077
Epoch [   46/   50] | d_loss: 0.5339 | g_loss: 1.8127
Epoch [   46/   50] | d_loss: 0.4563 | g_loss: 2.3452
Epoch [   46/   50] | d_loss: 0.5118 | g_loss: 4.7833
Epoch [   46/   50] | d_loss: 0.5592 | g_loss: 2.5786
Epoch [   46/   50] | d_loss: 0.4679 | g_loss: 3.3854
Epoch [   46/   50] | d_loss: 0.3857 | g_loss: 3.6521
Epoch [   46/   50] | d_loss: 0.5623 | g_loss: 3.6183
Epoch [   46/   50] | d_loss: 0.4831 | g_loss: 2.2800
Epoch [   46/   50] | d_loss: 0.6359 | g_loss: 2.2390
Epoch [   46/   50] | d_loss: 0.4178 | g_loss: 3.1498
Epoch [   46/   50] | d_loss: 0.4451 | g_loss: 3.3579
Epoch [   46/   50] | d_loss: 0.5878 | g_loss: 1.6087
Epoch [   47/   50] | d_loss: 0.4573 | g_loss: 3.0907
Epoch [   47/   50] | d_loss: 0.7036 | g_loss: 1.7195
Epoch [   47/   50] | d_loss: 0.5324 | g_loss: 3.1793
Epoch [   47/   50] | d_loss: 0.4046 | g_loss: 3.4093
Epoch [   47/   50] | d_loss: 0.4349 | g_loss: 4.1053
Epoch [   47/   50] | d_loss: 0.3886 | g_loss: 3.9621
Epoch [   47/   50] | d_loss: 0.4814 | g_loss: 2.4662
Epoch [   47/   50] | d_loss: 0.4565 | g_loss: 5.3355
Epoch [   47/   50] | d_loss: 0.3802 | g_loss: 4.4243
Epoch [   47/   50] | d_loss: 0.5944 | g_loss: 3.0246
Epoch [   47/   50] | d_loss: 0.4150 | g_loss: 4.2103
Epoch [   47/   50] | d_loss: 0.4801 | g_loss: 4.2263
Epoch [   47/   50] | d_loss: 0.4336 | g_loss: 1.5109
Epoch [   47/   50] | d_loss: 0.4484 | g_loss: 5.8960
Epoch [   47/   50] | d_loss: 0.5664 | g_loss: 3.8335
Epoch [   47/   50] | d_loss: 0.4003 | g_loss: 4.3340
Epoch [   47/   50] | d_loss: 0.4319 | g_loss: 4.0462
Epoch [   47/   50] | d_loss: 0.4402 | g_loss: 3.8093
Epoch [   47/   50] | d_loss: 0.6529 | g_loss: 3.1565
Epoch [   47/   50] | d_loss: 0.4885 | g_loss: 2.5521
Epoch [   47/   50] | d_loss: 0.3990 | g_loss: 1.6559
Epoch [   47/   50] | d_loss: 0.7471 | g_loss: 1.6704
Epoch [   47/   50] | d_loss: 0.4386 | g_loss: 3.3115
Epoch [   47/   50] | d_loss: 0.4150 | g_loss: 1.7804
Epoch [   47/   50] | d_loss: 0.4176 | g_loss: 3.0804
Epoch [   47/   50] | d_loss: 0.4941 | g_loss: 2.7889
Epoch [   47/   50] | d_loss: 0.5356 | g_loss: 2.7221
Epoch [   47/   50] | d_loss: 0.4088 | g_loss: 3.9616
Epoch [   47/   50] | d_loss: 0.3859 | g_loss: 3.0080
Epoch [   47/   50] | d_loss: 0.4144 | g_loss: 4.0332
Epoch [   47/   50] | d_loss: 0.4318 | g_loss: 4.4956
Epoch [   47/   50] | d_loss: 0.4587 | g_loss: 3.6926
Epoch [   47/   50] | d_loss: 0.5735 | g_loss: 3.4839
Epoch [   47/   50] | d_loss: 1.2522 | g_loss: 3.5037
Epoch [   47/   50] | d_loss: 0.5617 | g_loss: 3.2211
Epoch [   47/   50] | d_loss: 0.4352 | g_loss: 1.6426
Epoch [   48/   50] | d_loss: 0.5196 | g_loss: 2.8607
Epoch [   48/   50] | d_loss: 0.4945 | g_loss: 4.4156
Epoch [   48/   50] | d_loss: 0.4155 | g_loss: 3.6066
Epoch [   48/   50] | d_loss: 0.3837 | g_loss: 4.3049
Epoch [   48/   50] | d_loss: 0.4384 | g_loss: 4.3143
Epoch [   48/   50] | d_loss: 0.3752 | g_loss: 4.3856
Epoch [   48/   50] | d_loss: 0.5135 | g_loss: 3.7734
Epoch [   48/   50] | d_loss: 0.4672 | g_loss: 2.5157
Epoch [   48/   50] | d_loss: 0.5895 | g_loss: 2.5041
Epoch [   48/   50] | d_loss: 0.3766 | g_loss: 3.1223
Epoch [   48/   50] | d_loss: 0.5045 | g_loss: 3.4327
Epoch [   48/   50] | d_loss: 0.4156 | g_loss: 3.8701
Epoch [   48/   50] | d_loss: 0.4465 | g_loss: 4.8574
Epoch [   48/   50] | d_loss: 0.4488 | g_loss: 3.9449
Epoch [   48/   50] | d_loss: 0.4946 | g_loss: 2.6295
Epoch [   48/   50] | d_loss: 0.5695 | g_loss: 2.4656
Epoch [   48/   50] | d_loss: 0.3921 | g_loss: 5.1352
Epoch [   48/   50] | d_loss: 0.4112 | g_loss: 4.4327
Epoch [   48/   50] | d_loss: 0.4360 | g_loss: 3.2172
Epoch [   48/   50] | d_loss: 0.4025 | g_loss: 3.5486
Epoch [   48/   50] | d_loss: 0.5633 | g_loss: 1.9540
Epoch [   48/   50] | d_loss: 0.4364 | g_loss: 2.3749
Epoch [   48/   50] | d_loss: 0.4056 | g_loss: 5.0523
Epoch [   48/   50] | d_loss: 0.4359 | g_loss: 5.0013
Epoch [   48/   50] | d_loss: 0.3741 | g_loss: 2.9260
Epoch [   48/   50] | d_loss: 0.5644 | g_loss: 2.3383
Epoch [   48/   50] | d_loss: 0.4196 | g_loss: 4.2420
Epoch [   48/   50] | d_loss: 0.4612 | g_loss: 3.2149
Epoch [   48/   50] | d_loss: 0.4518 | g_loss: 3.4355
Epoch [   48/   50] | d_loss: 0.3900 | g_loss: 4.6724
Epoch [   48/   50] | d_loss: 0.7074 | g_loss: 3.5339
Epoch [   48/   50] | d_loss: 0.4864 | g_loss: 4.8141
Epoch [   48/   50] | d_loss: 0.4932 | g_loss: 4.1499
Epoch [   48/   50] | d_loss: 0.4666 | g_loss: 4.0893
Epoch [   48/   50] | d_loss: 0.4890 | g_loss: 3.1600
Epoch [   48/   50] | d_loss: 0.9370 | g_loss: 3.1530
Epoch [   49/   50] | d_loss: 0.4251 | g_loss: 3.6025
Epoch [   49/   50] | d_loss: 0.4317 | g_loss: 3.4343
Epoch [   49/   50] | d_loss: 0.5399 | g_loss: 4.3759
Epoch [   49/   50] | d_loss: 0.6157 | g_loss: 2.6446
Epoch [   49/   50] | d_loss: 0.4371 | g_loss: 3.7096
Epoch [   49/   50] | d_loss: 0.4268 | g_loss: 3.7967
Epoch [   49/   50] | d_loss: 0.4274 | g_loss: 4.2992
Epoch [   49/   50] | d_loss: 0.4189 | g_loss: 3.5507
Epoch [   49/   50] | d_loss: 0.4366 | g_loss: 2.9475
Epoch [   49/   50] | d_loss: 0.6935 | g_loss: 2.0187
Epoch [   49/   50] | d_loss: 0.4451 | g_loss: 3.0786
Epoch [   49/   50] | d_loss: 0.5313 | g_loss: 2.4356
Epoch [   49/   50] | d_loss: 0.3956 | g_loss: 4.9522
Epoch [   49/   50] | d_loss: 0.3767 | g_loss: 4.9646
Epoch [   49/   50] | d_loss: 0.4760 | g_loss: 2.7926
Epoch [   49/   50] | d_loss: 0.4563 | g_loss: 4.2090
Epoch [   49/   50] | d_loss: 0.4540 | g_loss: 3.9513
Epoch [   49/   50] | d_loss: 0.4726 | g_loss: 3.2269
Epoch [   49/   50] | d_loss: 0.6964 | g_loss: 3.5398
Epoch [   49/   50] | d_loss: 0.4768 | g_loss: 4.7441
Epoch [   49/   50] | d_loss: 0.4052 | g_loss: 3.9513
Epoch [   49/   50] | d_loss: 0.5216 | g_loss: 3.0106
Epoch [   49/   50] | d_loss: 0.5182 | g_loss: 4.7430
Epoch [   49/   50] | d_loss: 0.3756 | g_loss: 4.4134
Epoch [   49/   50] | d_loss: 0.4505 | g_loss: 2.9972
Epoch [   49/   50] | d_loss: 0.5714 | g_loss: 3.0431
Epoch [   49/   50] | d_loss: 0.4090 | g_loss: 3.3717
Epoch [   49/   50] | d_loss: 0.4853 | g_loss: 2.7006
Epoch [   49/   50] | d_loss: 0.9918 | g_loss: 1.7243
Epoch [   49/   50] | d_loss: 0.5209 | g_loss: 2.6700
Epoch [   49/   50] | d_loss: 0.4954 | g_loss: 2.5210
Epoch [   49/   50] | d_loss: 0.4006 | g_loss: 3.9036
Epoch [   49/   50] | d_loss: 0.5271 | g_loss: 3.3070
Epoch [   49/   50] | d_loss: 0.4106 | g_loss: 5.0905
Epoch [   49/   50] | d_loss: 1.0817 | g_loss: 3.4579
Epoch [   49/   50] | d_loss: 0.5197 | g_loss: 2.5496
Epoch [   50/   50] | d_loss: 0.4288 | g_loss: 5.7978
Epoch [   50/   50] | d_loss: 0.3933 | g_loss: 3.9245
Epoch [   50/   50] | d_loss: 0.4137 | g_loss: 3.4277
Epoch [   50/   50] | d_loss: 0.4815 | g_loss: 3.0896
Epoch [   50/   50] | d_loss: 0.6000 | g_loss: 3.2502
Epoch [   50/   50] | d_loss: 0.4339 | g_loss: 4.3824
Epoch [   50/   50] | d_loss: 0.3639 | g_loss: 3.4578
Epoch [   50/   50] | d_loss: 0.4441 | g_loss: 2.8119
Epoch [   50/   50] | d_loss: 0.5711 | g_loss: 2.9180
Epoch [   50/   50] | d_loss: 0.5397 | g_loss: 3.5742
Epoch [   50/   50] | d_loss: 0.4147 | g_loss: 3.1784
Epoch [   50/   50] | d_loss: 0.5850 | g_loss: 2.6294
Epoch [   50/   50] | d_loss: 0.5391 | g_loss: 3.0421
Epoch [   50/   50] | d_loss: 0.4208 | g_loss: 3.8863
Epoch [   50/   50] | d_loss: 0.6519 | g_loss: 3.1330
Epoch [   50/   50] | d_loss: 0.4024 | g_loss: 4.8439
Epoch [   50/   50] | d_loss: 0.3944 | g_loss: 3.8166
Epoch [   50/   50] | d_loss: 0.5838 | g_loss: 3.3596
Epoch [   50/   50] | d_loss: 0.5144 | g_loss: 2.4878
Epoch [   50/   50] | d_loss: 0.5819 | g_loss: 3.6458
Epoch [   50/   50] | d_loss: 0.4617 | g_loss: 3.1987
Epoch [   50/   50] | d_loss: 0.5539 | g_loss: 3.7420
Epoch [   50/   50] | d_loss: 0.4682 | g_loss: 3.5142
Epoch [   50/   50] | d_loss: 0.4635 | g_loss: 2.8622
Epoch [   50/   50] | d_loss: 0.4984 | g_loss: 3.6079
Epoch [   50/   50] | d_loss: 0.5443 | g_loss: 3.3634
Epoch [   50/   50] | d_loss: 0.7650 | g_loss: 2.8077
Epoch [   50/   50] | d_loss: 0.7286 | g_loss: 2.0304
Epoch [   50/   50] | d_loss: 0.4296 | g_loss: 3.3932
Epoch [   50/   50] | d_loss: 0.5464 | g_loss: 3.5350
Epoch [   50/   50] | d_loss: 1.1297 | g_loss: 2.3176
Epoch [   50/   50] | d_loss: 0.6372 | g_loss: 2.4985
Epoch [   50/   50] | d_loss: 0.5065 | g_loss: 3.7486
Epoch [   50/   50] | d_loss: 0.4616 | g_loss: 2.8810
Epoch [   50/   50] | d_loss: 0.3963 | g_loss: 4.0131
Epoch [   50/   50] | d_loss: 0.3863 | g_loss: 4.2155

Training loss

Plot the training losses for the generator and discriminator, recorded after each epoch.

In [26]:
fig, ax = plt.subplots()
losses = np.array(losses)
plt.plot(losses.T[0], label='Discriminator', alpha=0.5)
plt.plot(losses.T[1], label='Generator', alpha=0.5)
plt.title("Training Losses")
plt.legend()
Out[26]:
<matplotlib.legend.Legend at 0x7fd9920969e8>
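As the log above shows, the per-iteration losses are noisy. If you want a smoother curve, you could apply a simple moving average before plotting — a minimal sketch, where `window` is an arbitrary choice and `raw` is synthetic stand-in data rather than the actual recorded losses:

```python
import numpy as np

def moving_average(values, window=10):
    """Smooth a 1-D sequence with a simple sliding-window mean."""
    values = np.asarray(values, dtype=float)
    kernel = np.ones(window) / window
    # 'valid' mode drops the edges where the window doesn't fully overlap
    return np.convolve(values, kernel, mode='valid')

# Synthetic stand-in for a noisy loss curve
raw = [3.0, 2.5, 2.8, 2.2, 2.4, 2.0, 2.1, 1.9]
smoothed = moving_average(raw, window=4)
print(smoothed)  # 5 values, each the mean of 4 consecutive entries
```

The same idea applies to each column of `losses` before calling `plt.plot`.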

Generator samples from training

View samples of images from the generator, and answer a question about the strengths and weaknesses of your trained models.

In [27]:
# helper function for viewing a list of passed-in sample images
def view_samples(epoch, samples):
    fig, axes = plt.subplots(figsize=(16,4), nrows=2, ncols=8, sharey=True, sharex=True)
    for ax, img in zip(axes.flatten(), samples[epoch]):
        img = img.detach().cpu().numpy()
        # (C, H, W) -> (H, W, C) for matplotlib
        img = np.transpose(img, (1, 2, 0))
        # rescale from the tanh output range [-1, 1] to uint8 pixels [0, 255]
        img = ((img + 1) * 255 / 2).astype(np.uint8)
        ax.xaxis.set_visible(False)
        ax.yaxis.set_visible(False)
        ax.imshow(img)
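The rescaling inside `view_samples` maps generator outputs from the tanh range [-1, 1] to displayable uint8 pixels in [0, 255]. A quick standalone check of the formula at the endpoints and midpoint:

```python
import numpy as np

# Endpoints and midpoint of the tanh output range
x = np.array([-1.0, 0.0, 1.0])
scaled = ((x + 1) * 255 / 2).astype(np.uint8)
print(scaled)  # [  0 127 255] -- 0.0 maps to 127 because uint8 casting truncates 127.5
```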
In [28]:
# Load samples from generator, taken while training
with open('train_samples.pkl', 'rb') as f:
    samples = pkl.load(f)
In [29]:
_ = view_samples(-1, samples)
In [38]:
# Let's take a look at how the model was progressing over time
for cnt in range(0, n_epochs):
    print('===== Epoch#{} ====='.format(cnt))
    view_samples(cnt, samples)
    plt.draw()
    plt.pause(0.001)
    print()
===== Epoch#0 =====
===== Epoch#1 =====
===== Epoch#2 =====
===== Epoch#3 =====
===== Epoch#4 =====
===== Epoch#5 =====
===== Epoch#6 =====
===== Epoch#7 =====
===== Epoch#8 =====
===== Epoch#9 =====
===== Epoch#10 =====
===== Epoch#11 =====
===== Epoch#12 =====
===== Epoch#13 =====
===== Epoch#14 =====
===== Epoch#15 =====
===== Epoch#16 =====
===== Epoch#17 =====
===== Epoch#18 =====
===== Epoch#19 =====
===== Epoch#20 =====
===== Epoch#21 =====
===== Epoch#22 =====
===== Epoch#23 =====
===== Epoch#24 =====
===== Epoch#25 =====
===== Epoch#26 =====
===== Epoch#27 =====
===== Epoch#28 =====
===== Epoch#29 =====
===== Epoch#30 =====
===== Epoch#31 =====
===== Epoch#32 =====
===== Epoch#33 =====
===== Epoch#34 =====
===== Epoch#35 =====
===== Epoch#36 =====
===== Epoch#37 =====
===== Epoch#38 =====
===== Epoch#39 =====
===== Epoch#40 =====
===== Epoch#41 =====
===== Epoch#42 =====
===== Epoch#43 =====
===== Epoch#44 =====
===== Epoch#45 =====
===== Epoch#46 =====
===== Epoch#47 =====
===== Epoch#48 =====
===== Epoch#49 =====

Question: What do you notice about your generated samples and how might you improve this model?

When you answer this question, consider the following factors:

  • The dataset is biased; it is made of "celebrity" faces that are mostly white
  • Model size; larger models have the opportunity to learn more features in a data feature space
  • Optimization strategy; optimizers and number of epochs affect your final result

Answer:

  • The generated samples are quite bright, and this is likely due to the biased dataset: it is made of "celebrity" faces that are mostly white. We would need more training images covering a wider variety of skin tones.
  • The generated samples are pixelated and fairly low-resolution. Training on larger images would improve the quality of the generated samples, but it would also increase the model size and training time.
  • A few samples still look fake in the last epoch (Epoch #49), for example, the 2nd and 7th images in the first row and the 5th, 6th, and 7th images in the second row. Since this model was trained for only 50 epochs, and the samples were still improving from epoch to epoch, increasing the number of epochs would likely make the generated samples look more realistic.
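One way to judge whether additional epochs would keep helping is to average the recorded losses per epoch and look at the trend. A sketch below, assuming `losses` is a list of `(d_loss, g_loss)` pairs with a fixed number of recordings per epoch — which may not match your actual bookkeeping:

```python
import numpy as np

def per_epoch_means(losses, records_per_epoch):
    """Average (d_loss, g_loss) pairs over each epoch."""
    arr = np.asarray(losses, dtype=float)
    n_epochs = len(arr) // records_per_epoch
    arr = arr[:n_epochs * records_per_epoch]   # drop any partial epoch
    return arr.reshape(n_epochs, records_per_epoch, 2).mean(axis=1)

# Synthetic example: 2 epochs, 3 recordings each
fake = [(1.0, 4.0), (0.8, 3.5), (0.6, 3.0),
        (0.5, 2.8), (0.5, 2.6), (0.4, 2.4)]
means = per_epoch_means(fake, records_per_epoch=3)
print(means)  # one row per epoch: [[0.8, 3.5], [0.4666..., 2.6]]
```

If the per-epoch generator loss is still falling at epoch 50, that supports training longer.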

Submitting This Project

When submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as "dlnd_face_generation.ipynb" and export it as an HTML file under "File" -> "Download as". Include the "problem_unittests.py" file in your submission.